Presented papers written in English and published in the conference proceedings will be submitted for inclusion in IEEE Xplore.
|Authors of DC VIS 2016 are kindly asked to prepare presentations of up to 10 minutes.
|Chair: Davor Davidović
|Distributed Computing and Cloud computing
|V. Xhafa (University of Prishtina, Prishtina, Kosovo), F. Dika (University of Vienna, Vienna, Austria)
Parameters that affect the parallel execution speed of programs in multi-core processor computers
The speed at which a computer executes programs depends directly on the number of processor cores used for their parallel execution. However, other parameters also affect computer speed, such as the maximum allowed number of threads and the size and organization of the processors' cache memory. To determine the impact of each of these parameters, we experimented with different types of computers, measuring their speed while solving a selected problem and while processing images. To compare the impact of individual parameters on computer speed, we present the results graphically, in joint diagrams for each parameter.
|M. Petrova-El Sayed (Jülich Supercomputing Centre Forschungszentrum Jülich GmbH, Jülich, Germany), K. Benedyczak, A. Rutkowski (Interdisciplinary Center for Mathematical and Computational Modelling Warsaw University, Warsaw, Poland), B. Schuller (Jülich Supercomputing Centre Forschungszentrum Jülich GmbH, Jülich, Germany)
Federated Computing on the Web: the UNICORE Portal
As modern science requires modern approaches, vast collaborations comprising federated resources on heterogeneous computing systems rise to meet the current scientific challenges. Due to their size and complexity, these computing systems become demanding and can further complicate the scientific process. As a result, scientists are burdened with the need for additional technical expertise that lies outside their domain.
UNICORE is middleware that serves as an abstraction layer to mask technical details and ensure easy and unified access to data and computation over federated infrastructures. The Portal is the newest client in the UNICORE portfolio, providing web access to data and computing systems. With the rising demand for an up-to-date, user-friendly graphical interface and for access to computational resources and data from any device at any point in time, the Portal meets the contemporary needs of the dynamic world wide web.
This paper describes the functionality of the Portal and its advantages over other clients. It also discusses the security and authentication methods offered by the Portal and presents use cases and client customizations from our practice. It concludes with ideas about future work and extension points for further simplification of scientific use.
|E. Nepovinnykh, G. Radchenko (South Ural State University, Chelyabinsk, Russian Federation)
Problem-Oriented Scheduling of Cloud Applications: PO-HEFT Algorithm Case Study
Today we see significantly increased use of a problem-oriented approach in the development of scheduling algorithms for cloud computing environments. Several such algorithms already exist. However, many of them require that the tasks within a single job be independent, and they do not account for the execution profile of each task or the volume of data transmitted. We propose a list-based algorithm for problem-oriented scheduling of applications in cloud environments that takes the applications' execution profiles into account. List-based scheduling prioritizes the computing tasks and then dispatches them in blocks for execution according to the obtained priorities. The proposed approach allows us to take into account the cost of transferring data between nodes, thereby reducing the total runtime of the workflow. The proposed algorithm is based on the Heterogeneous Earliest-Finish-Time (HEFT) algorithm, but modifies the calculation of task node levels and takes into account the cost of incoming communications from a task's parents.
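The HEFT baseline the authors modify prioritizes tasks by their "upward rank". A minimal sketch of that priority step (not the authors' PO-HEFT; the task names and cost values are invented for illustration):

```python
# Sketch of HEFT-style list-scheduling priorities: the upward rank of a task
# is its average computation cost plus the maximum, over its successors, of
# the average communication cost to that successor plus the successor's rank.

def upward_rank(task, comp, comm, succ, memo):
    """comp[t]: average computation cost; comm[(t, s)]: average data-transfer
    cost from t to s; succ[t]: list of successor tasks."""
    if task in memo:
        return memo[task]
    rank = comp[task] + max(
        (comm.get((task, s), 0) + upward_rank(s, comp, comm, succ, memo)
         for s in succ.get(task, [])),
        default=0,
    )
    memo[task] = rank
    return rank

def heft_order(tasks, comp, comm, succ):
    """Return tasks sorted by decreasing upward rank (the HEFT priority list)."""
    memo = {}
    for t in tasks:
        upward_rank(t, comp, comm, succ, memo)
    return sorted(tasks, key=lambda t: -memo[t])

# Toy workflow: A feeds B and C, both feed D.
comp = {"A": 3, "B": 5, "C": 2, "D": 4}
comm = {("A", "B"): 1, ("A", "C"): 2, ("B", "D"): 3, ("C", "D"): 1}
succ = {"A": ["B", "C"], "B": ["D"], "C": ["D"]}
order = heft_order(["A", "B", "C", "D"], comp, comm, succ)
print(order)
```

PO-HEFT, per the abstract, additionally folds application execution profiles and the cost of incoming parent communications into this ranking.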
|P. Brezany (University of Vienna, Faculty of Computer Science, Vienna, Austria), T. Ludescher, T. Feilhauer (University of Applied Sciences, Department of Computer Science, Dornbirn, Austria)
Towards a Novel Infrastructure for Conducting High Productive Cloud-Based Scientific Analytics
The life-science and health-care research environments offer an abundance of new opportunities to improve their efficiency and productivity by using big data in collaborative research processes. A key component of this development is e-Science analytics, which is nowadays typically supported by Cloud computing. However, state-of-the-art Cloud technology does not provide appropriate support for high-productivity e-Science analytics. In this paper, we show how the productivity of Cloud-based analytics systems can be increased by (a) supporting researchers in integrating multiple problem-solving environments into the life cycle of data analysis, (b) parallel code execution on top of multiple cores or computing machines, (c) enabling safe inclusion of sensitive datasets into analytical processes through improved security mechanisms, (d) introducing the scientific dataspace, a novel data management abstraction, and (e) automatic analysis services enabling faster discovery of scientific insights and providing hints to detect potentially new topics of interest. Moreover, an appropriate formal productivity model for evaluating infrastructure design decisions was developed. The result of realizing this vision, a key contribution of this effort, is called the High-Productivity Framework; it was tested and evaluated using a real life-science application domain, breath gas analysis, applied e.g. in cancer treatment.
|T. Dancheva, M. Gushev, V. Zdraveski, S. Ristov (Ss. Cyril and Methodius University, Faculty of Computer Science and Engineering, Skopje, Macedonia)
An OpenMP Runtime Profiler/Configuration Tool for Dynamic Optimization of the Number of Threads
This paper describes the implementation and experimental results of a tool for dynamically configuring the number of threads used in the OpenMP environment, based on the current state of the runtime at the time of the call. For this purpose, we use a mix of profiling and machine learning techniques to determine the number of threads to set at each call.
The proposed approach is designed to be cost-effective in the scenario of a highly dynamic runtime, primarily when running relatively long tasks consisting of a number of parallel constructs.
|E. Djebbar, G. Belalem (University of Oran 1, Oran, Algeria)
An Effective Task Scheduling Strategy in Multiple Data Centers in Cloud Scientific Workflow
Cloud computing is currently the most hyped and popular paradigm in the domain of distributed computing. In this model, data and computation are operated somewhere in a cloud, a collection of data centers owned and maintained by a third party. Scheduling is one of the most prominent activities executed in the cloud computing environment. The goal of cloud task scheduling is to achieve high system throughput and to allocate various computing resources to applications. The complexity of the scheduling problem increases with the number of tasks, and it becomes highly difficult to solve effectively. In this research, we propose a task scheduling strategy for Cloud scientific workflows based on gang scheduling in multiple data centers. The experiments show the performance of the proposed strategy in terms of response time and average cost of cloudlets.
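Gang scheduling requires all tasks of a gang to run simultaneously, so a whole gang must fit into one scheduling slot. A toy first-fit-decreasing sketch of that idea (the gang names and sizes are invented; this is not the authors' strategy):

```python
def gang_schedule(gangs, capacity):
    """Greedy gang-scheduling sketch: each gang is (name, size) and needs
    `size` processors at the same time; a gang that does not fit in any
    existing time slot opens a new one."""
    slots = []
    for name, size in sorted(gangs, key=lambda g: -g[1]):  # largest first
        for slot in slots:
            if slot["free"] >= size:
                slot["gangs"].append(name)
                slot["free"] -= size
                break
        else:
            slots.append({"free": capacity - size, "gangs": [name]})
    return [s["gangs"] for s in slots]

gangs = [("A", 4), ("B", 3), ("C", 2), ("D", 5)]
schedule = gang_schedule(gangs, capacity=8)
print(schedule)
```

Each inner list is one time slot in which all listed gangs run concurrently on the 8 available processors.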
|A. Ristovski, A. Guseva (Innovation dooel, Skopje, Macedonia), M. Gusev, S. Ristov (Ss. Cyril and Methodius University, Skopje, Macedonia)
Visualisation in the ECG QRS Detection Algorithms
Digital ECG data analysis is a trending concept in the field where applied computer science and medicine coincide. To meet the requirements that arise, our R&D team has created an environment where developers can test different approaches to data processing. To this end, the platform offers a number of features with the following main goals: 1) to increase the effectiveness of conducting a proper medical diagnosis, 2) to incorporate a unified format for storing the results of the diagnosis conducted, and 3) to test various ECG QRS detection algorithms.
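As a flavor of what such a test platform might exercise, here is a deliberately naive QRS detector sketch (amplitude threshold plus refractory period, run on a synthetic trace; not the platform's code, and all constants are illustrative):

```python
def detect_qrs(samples, fs=250, threshold=0.6, refractory=0.2):
    """Naive QRS detector sketch: report a beat at each local maximum above
    `threshold` (normalized units), then ignore `refractory` seconds to
    avoid double-counting the same complex."""
    beats, skip_until = [], -1.0
    for i in range(1, len(samples) - 1):
        t = i / fs
        if t < skip_until:
            continue
        if (samples[i] > threshold
                and samples[i] >= samples[i - 1]
                and samples[i] > samples[i + 1]):
            beats.append(t)
            skip_until = t + refractory
    return beats

# Synthetic trace: flat baseline with two sharp triangular "R peaks".
sig = [0.05] * 500
for center in (100, 350):
    for k in range(-3, 4):
        sig[center + k] = 1.0 - 0.2 * abs(k)
beats = detect_qrs(sig)
print(beats)
```

Real detectors (e.g. Pan-Tompkins-style pipelines) add band-pass filtering, differentiation, and adaptive thresholds; a platform like the one described lets such variants be swapped in and compared.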
|D. Alagić (NTH Media, Varaždin, Croatia), K. Arbanas (Paying Agency for Agriculture, Fisheries and Rural Development, Zagreb, Croatia)
Analysis and Comparison of Algorithms in Advanced Web Clusters Solutions
Today's websites (applications) represent an essential part of nearly every business system, so it is unacceptable for them to be unavailable, given the ever-increasing competition on the global market. Consequently, such systems are becoming more and more complex in order to achieve high availability. To achieve higher system availability, greater scalability is obtained by creating so-called Web farms or Web clusters. Such a system requires much more computing power than traditional solutions. Since such systems are very expensive and complex in nature, the question is how to obtain the best possible results with the least amount of investment. To achieve that, it is necessary to look at the component which holds the information on requests and traffic volume: the HTTP/HTTPS load balancer. The balancing is based on several algorithms; however, there are no comprehensive analyses indicating which algorithm to use depending on the Web cluster and the expected amount of traffic. For this reason, this paper provides a detailed comparison of several frequently used algorithms in several different Web cluster scenarios, i.e. loads. Examples are also given of when to use a certain algorithm.
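Two of the most frequently compared balancing algorithms differ as sketched below (a schematic illustration with hypothetical backend names, not tied to any specific load balancer product):

```python
import itertools

class RoundRobin:
    """Hand each new request to the next backend in turn, ignoring load."""
    def __init__(self, backends):
        self._cycle = itertools.cycle(backends)
    def pick(self, active):
        return next(self._cycle)

class LeastConnections:
    """Hand each new request to the backend with the fewest active connections."""
    def pick(self, active):
        return min(active, key=active.get)

active = {"web1": 0, "web2": 0, "web3": 0}   # active connections per backend
rr, lc = RoundRobin(list(active)), LeastConnections()

active["web1"] = 5   # web1 is tied up by long-lived connections
rr_choice = rr.pick(active)   # round robin still offers web1 first
lc_choice = lc.pick(active)   # least connections steers around it
print(rr_choice, lc_choice)
```

The example shows why the choice matters for a given traffic profile: round robin is cheap and fair under uniform, short requests, while least connections adapts when request durations vary, which is exactly the kind of scenario-dependent trade-off the paper measures.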
|D. Alagić (NTH Media, Varaždin, Croatia), D. Maček (UniCredit S.p.A. Zweigniederlassung, Wien, Austria)
Metamodeling as an Approach for Better Computer Resources Allocation in Web Clusters
Constant change is inherent to information technology, the proof of which can be found in many challenges, such as the business sector's cloud computing. Because of the recent economic and financial crisis, companies are forced to deliver the best possible results at the lowest possible cost. A main issue in cloud computing is computer resources (i.e. compute power): due to their less than optimal usage, computer resources generate high costs. The term cloud computing encompasses a wide range of technologies, so this paper focuses only on Web clusters. It describes the issue of improper use of resources in these systems and its main cause, and provides suggestions for further optimization. Additionally, the paper presents a new concept of HTTP/HTTPS traffic analysis which should enable more efficient usage of computing resources. The concept emerged from metamodeling two methods: the first is a method for distributing HTTP/HTTPS requests, and the other is the AHP method, which is used for classifying and prioritizing requests so that the entire Web cluster can operate as well as possible with the least amount of resources.
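The AHP step mentioned above derives priority weights from a pairwise-comparison matrix. A common approximation is the row geometric mean method (a sketch; the criteria and comparison values below are invented for illustration):

```python
import math

def ahp_weights(M):
    """Approximate the AHP priority vector with the row geometric mean
    method: the weight of each criterion is the geometric mean of its
    pairwise-comparison row, normalized so the weights sum to 1."""
    gms = [math.prod(row) ** (1.0 / len(row)) for row in M]
    total = sum(gms)
    return [g / total for g in gms]

# Hypothetical pairwise comparisons for classifying incoming HTTP requests;
# criterion order: [response-time criticality, payload size, client priority].
# M[i][j] states how much more important criterion i is than criterion j.
M = [
    [1,     3,     5],
    [1 / 3, 1,     3],
    [1 / 5, 1 / 3, 1],
]
w = ahp_weights(M)
print([round(x, 3) for x in w])
```

A full AHP treatment would also check the consistency ratio of the matrix; the eigenvector method gives slightly different weights, but the geometric mean variant is the standard lightweight approximation.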
|T. Davitashvili (I. Vekua Institute of Applied Mathematics of Tbilisi State University, Tbilisi, Georgia), N. Kutaladze (Georgian Hydro-meteorological Department, Tbilisi, Georgia), R. Kvatadze (Georgian Research and Educational Networks Association, Tbilisi, Georgia), G. Mikuchadze (Georgian Hydro-meteorological Department, Tbilisi, Georgia), Z. Modebadze, I. Samkharadze (Iv. Javakhishvili Tbilisi State University, Tbilisi, Georgia)
Showers Prediction by WRF Model above Complex Terrain
In the present article we configure the nested-grid WRF v3.6 model for the Caucasus region, taking into consideration the geographical-landscape character, topography heights, land use, soil type, temperature in deep layers, monthly vegetation distribution, albedo and other factors. Computations were performed using the Grid system GE-01-GRENA, with worker nodes (16+ cores and 32 GB RAM each) located at the Georgian Research and Educational Networking Association (GRENA), which has been included in the European Grid infrastructure. This provided a good opportunity to run the model on a larger number of CPUs and to store large amounts of data on Grid storage elements. Two particular cases of unexpected heavy showers, which took place on 13 June 2015 in Tbilisi and 20 August 2015 in Kakheti (eastern Georgia), were studied. Simulations were performed with two sets of domains with horizontal grid-point resolutions of 6.6 km and 2.2 km. The ability of the WRF model to predict precipitation with different microphysics and convective scheme components, taking into consideration the complex terrain of the Georgian territory, was tested. Some results of the numerical calculations performed with the WRF model are presented.
|I. Sidorov (Matrosov Institute for System Dynamics and Control Theory of Siberian Branch of Russian Academy of Sciences, Irkutsk, Russian Federation)
Methods and Tools to Increase Fault Tolerance of High-Performance Computing Systems
This work is devoted to the development of models, methods and tools to increase the fault tolerance of high-performance computing systems. The described models and methods are based on automatic diagnostics of the basic software and hardware components of these systems, automatic localization and correction of faults, and automatic HPC-system reconfiguration mechanisms.
|A. Feoktistov, I. Sidorov (Matrosov Institute for System Dynamics and Control Theory of Siberian Branch of Russian Academy of Sciences, Irkutsk, Russian Federation)
Logical-Probabilistic Analysis of Distributed Computing Reliability
The aim of the study is to develop tools for increasing the reliability of applied problem solving in heterogeneous distributed computing systems by applying diagnostics of computing resource components and analyzing the circuits used for solving these problems. Particular attention is paid to calculating the reliability of a problem-solving circuit on the basis of a logical-probabilistic method. This method is based on the transition from Boolean functions describing the reliability of the circuit to probability functions determining the indicators of such reliability. The reliability of the problem-solving circuit is improved by resource reservation, which brings the computational-process reliability indicator as close as possible to a predetermined reliability criterion, taking into account limitations on the number of allocated reserve resources. An example of creating problem-solving circuits and calculating their reliability is presented.
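The transition from a Boolean reliability function to a probability function can be illustrated by brute-force enumeration over component states (a sketch for a tiny series-parallel circuit with invented probabilities; logical-probabilistic methods in the literature avoid this exponential enumeration for large circuits):

```python
from itertools import product

def reliability(structure, p):
    """Probability that the Boolean structure function evaluates to True,
    given independent component working-probabilities p[i]."""
    total = 0.0
    for states in product([0, 1], repeat=len(p)):
        if structure(states):
            prob = 1.0
            for s, pi in zip(states, p):
                prob *= pi if s else (1 - pi)
            total += prob
    return total

# Circuit: component 0 in series with a redundant (parallel) pair (1, 2),
# i.e. a reserve resource backing up component 1.
series_parallel = lambda x: x[0] and (x[1] or x[2])
R = reliability(series_parallel, [0.9, 0.8, 0.8])
print(round(R, 4))
```

For this circuit the closed form is R = 0.9 * (1 - 0.2 * 0.2) = 0.864; the redundant pair is exactly the kind of reservation the abstract describes, lifting the parallel branch's reliability from 0.8 to 0.96.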
|D. Sušanj, D. Arbula (Tehnički fakultet, Rijeka, Croatia)
Distributed Graph Reduction Algorithm with Parallel Rigidity Maintenance
Precise localization in wireless sensor networks depends on distributed algorithms running on a large number of wireless nodes with limited energy, processing and memory resources. A prerequisite for calculating unique relative node locations is the rigidity of the network graph. To satisfy this prerequisite the network graph should be well connected, but executing a distributed algorithm on graphs with a large number of edges can significantly burden the scarce resources of wireless nodes. To reduce the network graph, a novel distributed algorithm for network reduction is proposed. The main objective of the proposed algorithm is to remove as many edges as possible while maintaining the graph rigidity property; in this paper, a special case of graph rigidity is considered, namely parallel rigidity.
|Y. Watashiba (Nara Institute of Science and Technology, Nara, Japan), S. Date (Osaka University, Osaka, Japan), H. Abe (University of Tsukuba, Ibaraki, Japan), K. Ichikawa (Nara Institute of Science and Technology, Nara, Japan), Y. Kido (Osaka University, Osaka, Japan), H. Yamanaka, E. Kawai (National Institute of Information and Communications Technology, Tokyo, Japan), S. Shimojo (Osaka University, Osaka, Japan)
Architecture of Virtualized Computational Resource Allocation on SDN-enhanced Job Management System Framework
Nowadays, users' computation requests to high-performance computing (HPC) environments have been increasing and diversifying, as large-scale simulations and analyses are required in various science fields. In order to handle such computation requests efficiently and flexibly, the allocation of virtualized computational resources on an HPC cluster system, as in Cloud computing services, is attracting attention. We aim to realize a novel resource management system (RMS) that can handle the various resources of an HPC cluster system, and have been studying and developing the SDN-enhanced Job Management System (JMS) Framework, which can manage an interconnect as a network resource by integrating the Software Defined Networking (SDN) concept into a traditional JMS. However, the current SDN-enhanced JMS Framework cannot allocate virtualized computational resources to a job, because computational resource management is performed by the mechanism of a traditional JMS. In this paper, we propose a mechanism to handle virtualized computational resources in the SDN-enhanced JMS Framework. This mechanism enables the deployment of virtual machines (VMs) requested by the user on the computing nodes allocated to a job, and the execution of the job's processes in those VMs.
| Coffee break 15 min
|S. Girtelschmid (Inst. for Application Oriented Knowledge Processing, JKU Linz, Linz, Austria), A. Salfinger (Dept. of Cooperative Information Systems, JKU Linz, Linz, Austria), B. Pröll (Inst. for Application Oriented Knowledge Processing, JKU Linz, Linz, Austria), W. Retschitzegger, W. Schwinger (Dept. of Cooperative Information Systems, JKU Linz, Linz, Austria)
Near Real-time Detection of Crisis Situations
When disaster strikes, be it natural or man-made, the immediacy of notifying emergency professionals is critical to initiating the best possible response. As social media has become ubiquitous in recent years, affected citizens have become fast reporters of incidents. However, exploiting such 'citizen sensors' to identify a crisis situation comes at the price of having to sort, in near real-time, through vast amounts of mostly unrelated and highly unstructured information exchanged among individuals around the world. Identifying bursts in conversations can decrease this burden by pinpointing an event of potential interest. Still, the sheer volume of information keeps the computational requirements of such procedures, even if optimized, too high for a non-distributed approach. This is where the currently emerging, real-time-focused distributed processing systems may excel. This paper elaborates on the possible practices, caveats, and recommendations for engineering a cloud-centric application on one such system. We used the distributed real-time computation system Apache Storm and its Trident API to detect crisis situations by identifying bursts in a streamed Twitter communication. We contribute a system architecture for the suggested application and a high-level description of its components' implementation.
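Burst detection over a message stream can be sketched as a sliding-window count compared against an adaptive baseline (a toy single-process illustration with invented parameters; the paper distributes this kind of work over Apache Storm's Trident API):

```python
from collections import deque

class BurstDetector:
    """Flag a burst when the message count inside a sliding time window
    exceeds `factor` times a slowly adapting baseline count."""
    def __init__(self, window=60.0, factor=2.0, baseline=5.0, alpha=0.01):
        self.window, self.factor = window, factor
        self.baseline, self.alpha = baseline, alpha
        self.times = deque()

    def observe(self, t):
        self.times.append(t)
        while self.times and t - self.times[0] > self.window:
            self.times.popleft()
        count = len(self.times)
        burst = count > self.factor * self.baseline
        if not burst:
            # adapt the baseline only on non-burst traffic, so that a
            # sustained burst keeps standing out
            self.baseline += self.alpha * (count - self.baseline)
        return burst

det = BurstDetector()
quiet = [det.observe(float(t)) for t in range(0, 300, 10)]   # ~6 msgs/min
spike = [det.observe(300 + 0.5 * i) for i in range(40)]      # 40 msgs in 20 s
print(any(quiet), any(spike))
```

In a distributed setting the window counts would be partitioned (e.g. by keyword or region) across workers and only the aggregates compared, which is what makes the approach tractable at Twitter-stream volumes.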
|M. Kozlovszky, L. Kovács, K. Batbayar, Z. Garaguly (Obuda University, Budapest, Hungary)
Automatic Protocol Based Intervention Plan Analysis in Healthcare
Evidence- and protocol-based medicine decreases the complexity of, and at the same time standardizes, the healing process. Intervention descriptions are only moderately open to the public, and they differ more or less at every medical service provider. Normally, patients are not very familiar with the steps of the intervention process. There is a definite need, expressed by patients, to view the whole healing process through intervention plans, so they can prepare themselves in advance for the coming medical interventions. Intervention plan tracking is a game changer for practitioners too: they can follow the clinical pathway of their patients and receive objective feedback from various sources about the impact of the services. Resource planning (with time, cost and other important parameters) and resource pre-allocation become feasible tasks in the healthcare sector. The evolution of consensus protocols developed by medical professionals and practitioners requires accurate measurement of the difference between plans and real-world scenarios. To support these comparisons we have developed the Intervention Process Analyzer and Explorer software solution. It enables practitioners and healthcare managers to review, in an objective way, the effectiveness of interventions targeted at health care professionals and aimed at improving the process of care and patient outcomes.
|Ž. Jeričević (Engineering Faculty, Rijeka, Croatia), I. Kožar (Civil Engineering Faculty, Rijeka, Croatia)
Using Fourier and Hartley Transform for Fast, Approximate Solution of Dense Linear Systems
The solution of a linear system of equations is one of the most common tasks in scientific computing. For large dense systems it requires a prohibitive number of operations, on the order of n^3, where n is the number of equations and unknowns. We developed a novel numerical approach for finding an approximate solution to this problem based on the Fourier or Hartley transform, although any unitary, orthogonal transform that concentrates power in a small number of coefficients can be used. This strategy is borrowed from digital signal processing, where pruning redundant information from spectra, or filtering selected information in the frequency domain, is the usual practice. The procedure is to transform the linear system along its columns and rows to the frequency domain, generating a transformed system. The least significant portions of the transformed system can then be pruned.
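The pruning idea can be illustrated in one dimension: transform, discard coefficients below a threshold, and reconstruct (a sketch with a naive DFT on a signal, not the paper's two-dimensional row-and-column transform of a system matrix):

```python
import cmath, math

def dft(x, inverse=False):
    """Naive O(n^2) discrete Fourier transform (enough for a small demo)."""
    n = len(x)
    sign = 1 if inverse else -1
    out = [sum(x[k] * cmath.exp(sign * 2j * cmath.pi * j * k / n)
               for k in range(n))
           for j in range(n)]
    return [v / n for v in out] if inverse else out

# A smooth signal whose spectrum concentrates in very few coefficients.
n = 32
signal = [3 + 2 * math.cos(2 * math.pi * t / n) for t in range(n)]
spectrum = dft(signal)

# Prune: drop every coefficient whose magnitude is below a threshold.
kept = [c if abs(c) > 1e-6 else 0 for c in spectrum]
n_kept = sum(1 for c in kept if c != 0)

# Reconstruct from the pruned spectrum and measure the worst-case error.
approx = dft(kept, inverse=True)
err = max(abs(a - s) for a, s in zip(approx, signal))
print(n_kept, err < 1e-9)
```

Here 32 samples compress to 3 significant coefficients with negligible reconstruction error; applied to a dense system matrix along both axes, the same concentration of power is what permits solving a much smaller transformed system approximately.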
|N. Mikuličić, Ž. Mihajlović (Sveučilište u Zagrebu, Fakultet elektrotehnike i računarstva, Zagreb, Croatia)
Procedural Generation of Mediterranean Environments
This paper describes the overall process of procedural generation of natural environments through terrain generation, texturing and scattering of terrain cover. Although the described process can be used to create various types of environments, the focus of this paper is on the Mediterranean, which is somewhat specific and has not yet received attention in scientific papers. We present a novel technique for procedural texturing and scattering of terrain cover based on cascading input parameters. Input parameters can be used to scatter vegetation simply by the slope and height of the terrain, but they can also easily be extended and combined with more advanced parameters such as wind maps, moisture maps, per-plant distribution maps, etc. Additionally, we present a method for using a satellite image as an input parameter. Comparing the results with real-life images shows that our approach can create plausible, visually appealing landscapes.
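Cascading input parameters for scattering can be sketched as per-parameter suitability scores that multiply together, so any unsuitable parameter vetoes placement (the thresholds, cell values and density are invented for illustration, not the paper's actual parameters):

```python
import random

def suitability(height, slope, h_range=(0.0, 600.0), max_slope=35.0):
    """Each parameter maps to a [0, 1] score; the scores multiply, so a
    plant species is excluded wherever any single parameter is unsuitable."""
    lo, hi = h_range
    h_score = 1.0 if lo <= height <= hi else 0.0
    s_score = max(0.0, 1.0 - slope / max_slope)
    return h_score * s_score

def scatter(cells, density=1.0, seed=42):
    """Place a plant in a terrain cell with probability density * suitability."""
    rng = random.Random(seed)
    return [i for i, (h, s) in enumerate(cells)
            if rng.random() < density * suitability(h, s)]

cells = [(100, 5), (300, 50), (900, 5), (450, 20)]  # (height m, slope deg)
placed = scatter(cells)
print(placed)
```

Extending the cascade means multiplying in further scores (wind, moisture, a satellite-image mask) without changing the placement loop, which is the appeal of this formulation.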
|H. Rostamzadeh Hajilari, M. Talebi, M. Sharifi (Iran University of Science and Technology, Tehran, Iran)
Energy-aware Power Management of Virtualized Multi-core Servers through DVFS and CPU Consolidation
The considerable energy consumption of data centers results in high service costs, besides environmental pollution. Therefore, energy saving in operating data centers has received a lot of attention in recent years. Although modern multi-core architectures provide power management techniques such as dynamic voltage and frequency scaling (DVFS) and per-core power gating (PCPG), as well as CPU consolidation techniques for energy saving, the joint deployment of these features has been less exercised. With chip multiprocessors (CMPs), power management that considers both frequency scaling and core-count management can offer more efficient energy consumption in environments operating large data centers. In this paper, we focus on dynamic power management in virtualized multi-core server systems as used in cloud-based systems. We propose an algorithm that combines these power management techniques to select an efficient number of cores and a frequency level in CMPs while keeping an acceptable level of performance. The paper also reports an extensive set of experimental results obtained on a realistic multi-core server system running the RUBiS benchmark, demonstrating energy savings of up to 67% compared to the baseline. Additionally, the algorithm outperforms two existing consolidation algorithms for virtualized servers by 15% and 21%.
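The joint core-count/frequency selection can be sketched as a tiny exhaustive search under a simplified power model (the cubic-frequency term and all constants below are illustrative assumptions, not the paper's model or algorithm):

```python
def best_config(core_counts, freqs, demand, p_static=2.0, k=1.5):
    """Pick the (cores, GHz) pair that meets `demand` (normalized work
    units, assumed proportional to cores * frequency) at minimum power.
    Per-core power is modeled as a static part plus k * f**3, a common
    DVFS approximation; power-gated cores are assumed to consume nothing."""
    best = None
    for c in core_counts:
        for f in freqs:
            if c * f < demand:
                continue  # this configuration cannot sustain the load
            power = c * (p_static + k * f ** 3)
            if best is None or power < best[0]:
                best = (power, c, f)
    return best

result = best_config([1, 2, 4, 8], [0.8, 1.2, 1.6, 2.0], demand=3.0)
print(result)
```

With these numbers the search favors four slow cores over two fast ones, the consolidation-versus-DVFS trade-off the abstract refers to; a larger static (leakage) term tips the balance toward fewer, faster cores with the rest power-gated.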
|W. Ni, Y. Gao (College of Physics and Information Engineering, Fuzhou University, Fuzhou, China), Ž. Lučev Vasić (Faculty of Electrical Engineering and Computing, University of Zagreb, Zagreb, Croatia), S. Pun (Department of Electrical and Electronics Engineering, University of Macau, Macau SAR, China), M. Cifrek (Faculty of Electrical Engineering and Computing, University of Zagreb, Zagreb, Croatia), M. Vai (Department of Electrical and Electronics Engineering, University of Macau, Macau SAR, China), M. Du (College of Physics and Information Engineering, Fuzhou University, Fuzhou, China)
Human Posture Detection Based on Human Body Communication with Multi-carrier Modulation
Multi-node sensors for human posture detection, by acquiring kinematic parameters of the human body, help the further study of the laws of human motion. They can serve as a reference for quantitative analysis in specific applications such as healthcare, virtual reality, sports training and military affairs. Compared with the traditional optical method, posture detection based on inertial sensors has fewer spatial limitations, lower cost, and easier implementation. In this paper, a human posture detection system is introduced. Using the parameter data obtained from the inertial sensors, the three-dimensional angles of the human hand movement are calculated via a quaternion algorithm for data fusion. The transmission of angle data among the sensor nodes was successfully realized by a human body communication (HBC) transceiver based on capacitive coupling with multi-carrier OOK modulation at a data rate of 57.6 kbps. The human posture could then be reconstructed on the PC host. Ultimately, the implementation of the overall system showed its feasibility.
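The final step of such a quaternion pipeline, turning the fused orientation into three-dimensional angles, can be sketched as a standard Euler-angle conversion (the fusion filter itself is omitted, and the ZYX convention here is an assumption, not necessarily the paper's):

```python
import math

def quat_to_euler(w, x, y, z):
    """Convert a unit quaternion to (roll, pitch, yaw) in degrees using the
    common aerospace ZYX convention."""
    roll = math.atan2(2 * (w * x + y * z), 1 - 2 * (x * x + y * y))
    # clamp guards against tiny floating-point overshoot outside [-1, 1]
    pitch = math.asin(max(-1.0, min(1.0, 2 * (w * y - z * x))))
    yaw = math.atan2(2 * (w * z + x * y), 1 - 2 * (y * y + z * z))
    return tuple(math.degrees(a) for a in (roll, pitch, yaw))

# A 90-degree rotation about the vertical (z) axis.
half = math.radians(90) / 2
angles = quat_to_euler(math.cos(half), 0.0, 0.0, math.sin(half))
print([round(a, 6) for a in angles])
```

In a sensor-node setting, only the compact quaternion or the derived angles need to cross the low-rate HBC link, which is one reason quaternion representations suit such systems.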
|O. Zaikin, S. Kochemazov, A. Semenov (Matrosov Institute for System Dynamics and Control Theory of Siberian Branch of Russian Academy of Sciences, Irkutsk, Russian Federation)
SAT-based Search for Systems of Diagonal Latin Squares in Volunteer Computing Project SAT@home
In this paper we consider the problem of finding pairs of mutually orthogonal diagonal Latin squares of order 10. First we reduce it to the Boolean satisfiability problem (SAT). The obtained instance is very hard, therefore we decompose it into a family of subproblems. To solve the latter we used the volunteer computing project SAT@home. In the course of a 9-month computational experiment we managed to find 26 pairs of the described kind that differ from the already known pairs. We also consider the problem of searching for triples of diagonal Latin squares of order 10 that satisfy a weakened orthogonality condition. Using diagonal Latin squares from known pairs, we found new triples of the proposed kind. For this computational experiment we used a computing cluster.
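The properties being searched for are easy to state as checks (a verification sketch at order 4, not the SAT encoding used in the project; the search itself is hard precisely because these constraints must hold simultaneously):

```python
def is_diagonal_latin(sq):
    """A diagonal Latin square of order n: every row, every column, and both
    main diagonals contain each of the n symbols exactly once."""
    n = len(sq)
    syms = set(range(n))
    rows = all(set(row) == syms for row in sq)
    cols = all({sq[i][j] for i in range(n)} == syms for j in range(n))
    diags = ({sq[i][i] for i in range(n)} == syms and
             {sq[i][n - 1 - i] for i in range(n)} == syms)
    return rows and cols and diags

def are_orthogonal(a, b):
    """Two squares are orthogonal if superimposing them yields all n^2
    distinct ordered pairs of symbols."""
    n = len(a)
    pairs = {(a[i][j], b[i][j]) for i in range(n) for j in range(n)}
    return len(pairs) == n * n

# An order-4 example: a self-orthogonal diagonal Latin square and its
# transpose form an orthogonal pair.
A = [[0, 2, 3, 1],
     [3, 1, 0, 2],
     [1, 3, 2, 0],
     [2, 0, 1, 3]]
B = [list(col) for col in zip(*A)]
ok = is_diagonal_latin(A) and is_diagonal_latin(B) and are_orthogonal(A, B)
print(ok)
```

A SAT encoding expresses the same row/column/diagonal/pair-distinctness constraints over Boolean variables, so each satisfying assignment decodes into a pair like (A, B) above, but at order 10.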
|Chair: Davor Davidović, Ruđer Bošković Institute, Croatia
|Distributed Computing and Cloud computing
| Professional presentation
|S. Petrus (EBSCO Information Services, Prague, Czech Republic)
The Role of IEEE Literature in Patented Innovation
| Invited Lecture
|Z. Šojat, K. Skala (Ruđer Bošković Institute, Centre for informatics and Computing, Zagreb, Croatia)
Views on the Role and Importance of Dew Computing in the Service and Control Technology
Modern-day computing paradigms cater to a huge community of participants from almost the entire spectrum of human endeavour. For computing and data processing there are individual Computers, their Clusters, Grids, and, finally, the Clouds. For pure data communication there is the Internet, and for Human-understandable Information Communication, for example, the World Wide Web. The rapid development of hand-held mobile devices with high computational capabilities and Internet connectivity enabled certain parts of Clouds to be "lowered" into so-called "thin clients". This led to the development of the Fog Computing Paradigm, as well as of the Internet of Things (IoT) and Internet of Everything (IoE) concepts.
However, the most significant amount of information processing all around us is done at the lowest possible computing level, directly connected to the physical environment and mostly directly controlling our immediate human surroundings. These "invisible" information processing devices are found in our car's motor, in the refrigerator, the gas boiler, air-conditioners, vending machines, musical instruments, radio receivers, home entertainment systems, traffic controls, theatres, lights, wood-burning stoves, and ubiquitously all over industry and in industrial products. These devices, which are neither at the cloud/fog edge, nor even at the mobile edge, but rather at the physical edge of computing, are the basis of the Dew Computing Paradigm.
The merits of seamlessly integrating those "dew" devices into the Cloud - Fog - Dew Computing hierarchy are enormous, for individuals, the public and industrial sectors, the scientific community and the commercial sector, by bettering the physical and communicational, as well as the intellectual, immediate human environment.
The Dew Computing paradigm opens the possibility of developing integrated home management/entertainment/maintenance systems, self-organising traffic-control systems, intelligent driver suggestion systems, coordinated building/car/traffic pollution control systems, real-time hospital systems in which all patient and equipment status and control collaborate with the medical staff, fully consistent synaesthetic artistic performances including artists and independent individuals ("active public") from wide apart, power-distribution peak filtering, self-reorganisation and mutual cooperation systems based on the informed behaviour of individual power-consumption elements, emergency systems which cooperate with town traffic, and so on. It thereby shows the way towards the Distributed Information Services Environment (DISE), and finally towards the present civilisation's aim of establishing a Global Information Processing Environment (GIPE).
It is therefore essential, through Research, Innovation and Development, to explore the realm of possibilities of Dew Computing, to solve the basic problems of integrating the "dew" level with the higher-level Dew-Fog-Cloud hierarchy, with special attention to the necessity of information (not only data) processing and communication, and to demonstrate the viability and high effectiveness of the developed architecture in several areas of human endeavour through real-life implementations. The present scientific and technological main objective is to provide the concepts, methods and proof-of-concept implementations that move Dew Computing from a theoretical/experimental concept to a validated technology. Finally, it will be necessary to define and standardise the basics of the Dew Computing Architecture, Language and Ontology, a necessity for the seamless integration of the emerging new Global Information Processing Architecture into the Fog and Cloud Paradigms, as a way towards the above-mentioned civilisation goals.
|E. Afgan (Ruder Boskovic Institute, Zagreb, Croatia), A. Lonie (University of Melbourne, Melbourne, Australia), J. Taylor (Johns Hopkins University, Biology department, Baltimore, United States), K. Skala (Ruder Boskovic Institute, Zagreb, Croatia), N. Goonasekera (University of Melbourne, Melbourne, Australia)
Architectural Models for Deploying and Running Virtual Laboratories in the Cloud
Running virtual laboratories as software services in cloud computing environments requires numerous technical challenges to be addressed. Domain scientists using those virtual laboratories desire powerful, effective and simple-to-use systems. To meet those requirements, these systems are deployed as sophisticated services that require a high level of autonomy and resilience. In this paper we describe a number of deployment models based on technical solutions and experiences that enabled our users to deploy and use thousands of virtual laboratory instances.
|M. Telenta, L. Kos (University of Ljubljana, Ljubljana, Slovenia)
A CAD Service for Fusion Physics Codes
There is an increased need for coupling machine descriptions of various fusion physics codes. We present a CAD service library that interfaces geometrical data requested by physics codes in a completely programmatic way for use in scientific workflow engines. Fusion codes can request CAD geometrical data at different Levels of Detail (LOD) and control major assembly parameters. This service can be part of a scientific workflow that delivers meshing of the CAD model and/or variation of the parameters. In this paper we present the re-engineering of the ITER tokamak using an open-source CAD kernel that provides a standalone library of services. Modelling of the machine is done at several LODs, starting from a rough one and building/replacing with more detailed models by adding further details and features. Such CAD modelling of the machine with LODs delivers flexibility and data-provenance records for the complete CAD-to-physics-codes workflow chain.
|M. Kolman, G. Kosec (Jožef Stefan Institute, Ljubljana, Slovenia)
Correlation between Attenuation of 20 GHz Satellite Communication Link and Liquid Water Content in the Atmosphere
The effect of Liquid Water Content (LWC), i.e. the mass of water per unit volume of the atmosphere, on the attenuation of a 20 GHz communication link between a ground antenna and a communication satellite is tackled in this paper. The wavelength of 20 GHz electromagnetic radiation is comparable to the droplet size; consequently, scattering plays an important role in the attenuation. To better understand such a system, the correlation between measured LWC and attenuation is analysed and compared to different models, e.g. the Mie scattering model and the Marshall-Palmer statistical model. The LWC is usually estimated from pluviograph rain-rate measurements, which capture only spatially localized, ground-level information about the LWC. In this paper the LWC is extracted from reflectivity measurements provided by a 5.6 GHz weather radar situated at Lisca, Slovenia. The radar measures reflectivity in 3D, and therefore a precise spatial dependency of LWC along the communication link is considered. The attenuation is measured with an in-house receiver, Ljubljana Station SatProSi 1, that communicates with the geostationary communication satellite ASTRA 3B in the 20 GHz band.
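The power-law attenuation and Marshall-Palmer LWC relations mentioned above can be sketched numerically. The coefficients below (the power-law k and alpha at 20 GHz, and the LWC fit) are illustrative textbook-style values, not the paper's fitted parameters:

```python
import math

def specific_attenuation_db_per_km(rain_rate_mm_h, k=0.075, alpha=1.10):
    """Specific attenuation gamma = k * R^alpha (dB/km).
    k and alpha are frequency-dependent; the defaults are illustrative
    values in the right range for ~20 GHz, not fitted coefficients."""
    return k * rain_rate_mm_h ** alpha

def marshall_palmer_lwc_g_m3(rain_rate_mm_h):
    """Approximate liquid water content (g/m^3) from rain rate via the
    Marshall-Palmer drop-size distribution; 0.0889 * R^0.84 is a
    commonly quoted empirical fit."""
    return 0.0889 * rain_rate_mm_h ** 0.84

def path_attenuation_db(rain_rates, segment_km):
    """Integrate specific attenuation along a link sampled in equal-length
    segments, as a radar-derived 3D LWC profile allows."""
    return sum(specific_attenuation_db_per_km(r) * segment_km for r in rain_rates)
```

A radar profile along the link would supply one rain-rate (or LWC-derived) sample per segment, so the path integral replaces the single ground-level pluviograph value.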
| Coffee break 20 min
|D. Grozev, M. Shopov, N. Kakanakov (Technical University of Sofia, branch Plovdiv, Plovdiv, Bulgaria)
Practical Implementation of Private Cloud with traffic optimization
This paper presents a practical implementation of a private cloud, based on VMware technology, optimized to support CoS and QoS (even when an overlay technology like VXLAN is used) in the field of smart metering in electrical power systems and IoT. The use of cloud computing technologies increases the reliability and availability of the system. All routing, firewall rules and NATs are configured using NSX. Implementation of CoS and QoS in the virtual and physical networks will guarantee the bandwidth necessary for normal operation among other virtualized services.
|P. Zinterhof (University Salzburg, Salzburg, Austria)
Improving Data Locality for NUMA-Agnostic Numerical Libraries
An increasing number of today's server systems are based on the NUMA paradigm, which offers scalable computing resources within a single OS image. The obstacles to getting very high performance from such systems are well known, and programming techniques and libraries have been created to alleviate the situation. While several libraries (e.g. likwid, hwloc) support proper pinning of threads to certain CPU cores, relatively little support is available for the placement of allocated RAM. Using legacy libraries or 'black box' code with no support for NUMA architectures potentially worsens the situation and can lead to sub-optimal application performance.
In this paper we propose a new way of optimizing data locality in multi-threaded applications, which is a key factor in obtaining high performance on NUMA systems. Our proposal is based on the first-touch policy that the Linux kernel employs for placing memory pages on different NUMA domains, but unlike standard malloc calls we aim for fine-grained, per-page decisions. These decisions are derived iteratively by benchmarking single kernels, or even the application itself, with varying page distributions. After an initial setup phase the process is fully automatic and applicable even without prior knowledge of the details of the computational kernel in question.
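The iterative, per-page search described above can be illustrated with a small sketch. This is a generic coordinate-descent loop over a pluggable cost function, not the authors' implementation; in a real system `benchmark` would bind pages to the given NUMA domains (e.g. via first-touch or `move_pages`) and time the kernel:

```python
def optimize_placement(n_pages, domains, benchmark, sweeps=3):
    """Search for a per-page NUMA-domain assignment that minimizes a
    measured kernel runtime.  `benchmark(placement)` is assumed to run
    the kernel with pages bound as given and return its runtime; any
    callable cost works for this sketch."""
    placement = [domains[0]] * n_pages       # start with everything on one domain
    best_cost = benchmark(placement)
    for _ in range(sweeps):
        for page in range(n_pages):          # coordinate descent, one page at a time
            for d in domains:
                candidate = placement[:]
                candidate[page] = d
                cost = benchmark(candidate)
                if cost < best_cost:         # keep only measured improvements
                    placement, best_cost = candidate, cost
    return placement, best_cost
```

With a separable cost (each page has one best domain) a single sweep already finds the optimum; real kernel runtimes are noisier, which is why repeated benchmarking sweeps are needed.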
|A. Jovic, D. Kukolja (University of Zagreb Faculty of Electrical Engineering and Computing, Zagreb, Croatia), K. Jozic (INA - industrija nafte, d.d., Zagreb, Croatia), M. Cifrek (University of Zagreb Faculty of Electrical Engineering and Computing, ZAGREB, Croatia)
Use Case Diagram Based Scenarios Design for a Biomedical Time-Series Analysis Web Platform
Biomedical time-series analysis deals with detection, classification and prediction of subjects' states and disorders. In this paper, we present requirements and scenarios of use for a novel web platform designed to analyze multivariate heterogeneous biomedical time-series. The scenarios of use are described with the corresponding UML Use Case Diagrams. We also discuss some architectural and technological issues, including parallelization, visualization and feature extraction from biomedical time-series. The goal of this paper is to present what we currently consider as the best approach for design of such a system, which may also be beneficial for similar biomedical software systems. The paper is focused on design and architectural considerations only, as implementation of the complex system has only just begun.
|M. Antonijević, S. Sučić, H. Keserica (Končar-KET, Zagreb, Croatia)
Augmented Reality for Substation Automation by Utilizing IEC 61850 Communication
IEC 61850 standard represents the most commonly used communication technology for new substation automation projects. Despite the fact that IEC 61850 provides a semantic data model and a standardized configuration description, these facts are underutilized in substation automation management today. This is specifically illustrated in the data visualization domain where new technologies such as virtual and augmented reality have reached significant maturity levels and have not been used for IEC 61850 system visualization so far.
In this paper, IEC 61850 features are combined with augmented reality technologies to provide added-value visualization capabilities in the substation automation domain. The developed prototype demonstrates a proof-of-concept solution for regular substation automation checks and maintenance activities.
|J. Brozek, M. Jakes, V. Svoboda (University of Pardubice, Pardubice, Czech Republic)
Innovation of the Campbell Vision Stimulator with the Use of Tablets
The article covers three fundamental themes: a) performance solutions using gaming to treat multiple eye defects, in particular amblyopia; b) an explanation of the issue and the design of software (including games) intended for therapeutic or health purposes; and c) highlighting modern solutions and the power of software products for the needs of the health sector, in particular in the fields of diagnostics and rehabilitation.
The reader can learn basic information about eye diseases and the principles of their treatment, and become acquainted with the reasons why computer games (and in particular video games) are appropriate for rehabilitation.
Particularly important and beneficial for the reader is the section of the article which focuses on a) the differences between the design of standard software and software designed for the needs of the healthcare system, b) the high risks associated with defects in any software, and even the risk of side effects with so-called "perfect" software, and c) the fact that a major part of software development does not comply with all of the standards.
The article also discusses the advantages of the software solution over other methods of rehabilitation. Most of the paradigms are generally applicable.
Familiarity with the principles of this application can thus be interesting even for developers in the relevant areas.
|A. Bánáti (Óbuda University, Budapest, Hungary), P. Kacsuk (MTA SZTAKI, Budapest, Hungary), M. Kozlovszky (Óbuda University, Budapest, Hungary)
Classification of Scientific Workflows Based on Reproducibility Analysis
In the scientific community, one of the most vital challenges is the reproducibility of workflow execution. The parameters necessary for execution (we call them descriptors) can be external, depending for example on the computing infrastructure (grids, clusters and clouds) or third-party resources, or internal, belonging to the code of the workflow, such as variables. During re-execution these parameters may change or become unavailable, and they can ultimately prevent the workflow from being reproduced. In most cases, however, the lack of the original parameters can be compensated by replacing, evaluating or simulating the values of the descriptors, at some extra cost, in order to make the workflow reproducible. Our goal in this paper is to classify scientific workflows based on the method, and its cost, by which they can become reproducible.
|E. Kail (Budapest Óbuda University, Budapest, Hungary), J. Kovács (MTA SZTAKI, Budapest, Hungary), M. Kozlovszky (Óbuda University, Budapest, Hungary), P. Kacsuk (MTA SZTAKI, Budapest, Hungary)
Dynamic Execution of Scientific Workflows in Cloud
Scientific workflows have emerged in the past decade as a new solution for representing complex scientific experiments. Generally, they are data- and compute-intensive applications and may need high-performance computing infrastructures (clusters, grids and clouds) to be executed. Recently, cloud services have gained widespread availability and popularity thanks to their rapid elasticity and resource pooling, which is well suited to the nature of scientific applications that may experience variable demand and occasional spikes in resource usage. In this paper we investigate dynamic execution capabilities, focused on fault-tolerant behaviour, in the Occopus framework, which was developed by SZTAKI to provide automatic features for configuring and orchestrating distributed applications (so-called virtual infrastructures) on single- or multi-cloud systems.
|P. Škoda, B. Medved Rogina (Ruđer Bošković Institute, Zagreb, Croatia)
FPGA Kernels for Classification Rule Induction
Classification is one of the core tasks in machine learning and data mining. Classification rules are one of several classification models; they describe the model with a set of if-then rules. In this paper we present a set of three FPGA-based compute kernels for accelerating classification rule induction. The kernels were implemented and evaluated for use cases common in sequential covering algorithms for rule induction. Kernels clocked at 160 MHz achieved speedups of up to 4.55× compared to a reference software implementation executed on a CPU clocked at 3.2 GHz.
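A minimal software reference of the coverage-counting step and the sequential covering loop it serves can make the workload concrete. The rule encoding (equality tests on nominal attributes) and the greedy precision criterion below are illustrative choices, not the paper's exact kernels:

```python
def covers(rule, example):
    """A rule is a dict {attr_index: required_value}; an example is a
    tuple of attribute values."""
    return all(example[a] == v for a, v in rule.items())

def rule_stats(rule, examples, labels, target):
    """The coverage-counting kernel: how many examples the rule covers,
    and how many of those carry the target class.  This counting is the
    hot loop an FPGA kernel would accelerate."""
    covered = [i for i, e in enumerate(examples) if covers(rule, e)]
    correct = sum(1 for i in covered if labels[i] == target)
    return len(covered), correct

def sequential_covering(examples, labels, target, candidate_rules):
    """Greedy sequential covering: repeatedly pick the candidate rule
    with the best precision on the remaining examples, then remove the
    examples it covers."""
    remaining = list(range(len(examples)))
    learned = []
    while remaining:
        best, best_prec, best_cov = None, 0.0, []
        for rule in candidate_rules:
            cov = [i for i in remaining if covers(rule, examples[i])]
            correct = sum(1 for i in cov if labels[i] == target)
            prec = correct / len(cov) if cov else 0.0
            if cov and prec > best_prec:
                best, best_prec, best_cov = rule, prec, cov
        if best is None:                 # no useful rule left: stop
            break
        learned.append(best)
        remaining = [i for i in remaining if i not in best_cov]
    return learned
```

Because every candidate rule is scored against every remaining example on each iteration, the coverage count dominates the runtime, which is why it is the natural target for hardware acceleration.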
| Chair: Karolj Skala, Ruđer Bošković Institute Croatia
|J. Opiła (AGH University of Science and Technology, Cracow, Poland)
Prototyping of Visualization Designs of 3D Vector Fields Using POVRay Rendering Engine
There is a persistent quest for novel methods of visualization in order to gain insight into complex phenomena in a variety of scientific domains. Researchers, e.g. the VTK team, have achieved excellent results; however, some problems connected with the implementation of new techniques and the quality of the final images still persist.
Results of an inspection of a number of visualization styles for 3D vector fields employing the POVRay ray-tracing engine are discussed, i.e. hedgehogs, oriented glyphs, streamlines, the isosurface component approach and texturing design. All styles presented have been tested using a water molecule model and compared with respect to computing time, informativeness and general appearance. It is shown that the Scene Description Language (SDL), the domain-specific language implemented in POV-Ray, is flexible enough to be used as a tool for fast prototyping of novel and exploratory visualization techniques. Visualizations discussed in the paper were computed using selected components of the API of ScPovPlot3D, i.e. templates written in SDL. Results are compared to designs already implemented in VTK.
|M. Babič, B. Jerman-Blažič (Jožef Stefan Institute, Ljubljana, Slovenia)
New Cybercrime Taxonomy of Visualization of Data Mining Process
Data mining is the process of identifying new patterns and insights in data and of knowledge discovery, and it lies at the intersection of multiple research areas, including machine learning, statistics, pattern recognition, databases, and visualization. With the maturity of databases and constant improvements in computational speed, data mining algorithms that were once too expensive to execute are now within reach. Data visualization is a general term that describes any effort to help people understand the significance of data by placing it in a visual context. Patterns that might go undetected in text-based data can be exposed and recognized more easily with data visualization software. Exploring and analyzing the vast volumes of data becomes increasingly difficult, and information visualization and visual data mining can help to deal with the flood of information. A large number of information visualization techniques have been developed over the last decade to support the exploration of large data sets. In this paper, we propose a classification of information visualization and visual data mining techniques. Fractals and graph theory are popular in many areas; we develop a new method for estimating the fractal dimension of a network and a new taxonomy of visualization of the data mining process applied to cybercrime activity.
|B. Popovic, A. Balota (Fakultet za informacione tehnologije, Univerzitet Mediteran, Podgorica, Montenegro), D. Strujic (Nilex AB, Helsingborg, Sweden)
Visual Representation of Predictions in Software Development Based on Software Metrics History Data
Software that has been in development for more than a year is categorized as a large project and enters the area of critical and risky undertakings in terms of successful completion. For this reason, such projects require constant monitoring by the software manager. To enable adequate monitoring of a developed solution and its projects, a number of software metrics provide this information in numerical form, but very few of them are displayed to the software manager in visual form. Predictions of future solution development, and their visual representation, based on historical data gathered from the same system, are usually not included in such systems. Systems of this kind must be able to predict the future state of the developed system and adapt their output after analyzing its software metrics history data. In this paper, we propose an intelligent system that analyzes the software metrics history data of a solution and produces predictions and visual outputs to support the software manager's decisions. The results show that using this decision support tool, as a form of intelligent system, helps software managers in their decision making during project management and reduces overall project risk.
|I. Prazina, K. Balic, K. Prses, S. Rizvic, V. Okanović (Faculty of Electrical Engineering, Sarajevo, Bosnia and Herzegovina)
Interaction with Virtual Objects in a Natural Way
Digital technologies are an efficient tool for visualization and presentation of virtual objects. Interaction with virtual objects can be very important for their presentation, especially in the domain of cultural heritage. Implementing natural interactive online 3D visualization is a difficult process. This paper describes interaction techniques for manipulating 3D objects and gives one solution for natural interaction with virtual objects, using Leap Motion as a sensor and WebGL for presenting the virtual hand and 3D objects.
|D. Sušanj (Engineering Faculty, RIJEKA, Croatia), G. Gulan (Medical Faculty, RIJEKA, Croatia), I. Kožar (Civil Engineering Faculty, RIJEKA, Croatia), Ž. Jeričević (Engineering Faculty, RIJEKA, Croatia)
Bone Shape Characterization Using the Fourier Transform and Edge Detection in Digital X-Ray Images
From a series of digital X-ray images we extracted bone edges using information-entropy-based edge detection algorithms. The extracted edges are series of two-dimensional coordinates in image space. These series were tested for shapes of medical interest inside the knee joint. In particular, the existence of straight edges inside the knee joint was tested for by digital filtering and analytic computation of the first and second derivatives in the Fourier domain. This kind of analysis was used because it provides rotational invariance for the sought bone shape and allows statistical comparison of shapes from different images.
Real-life examples are taken from medical practice, using X-ray imaging of a series of knee joints, to illustrate the analysis procedures and their medical relevance for real data.
|M. Kranjac (University of Novi Sad, Faculty of Technical Sciences, Novi Sad, Serbia), U. Sikimić (Singapore Management University, Institute of Innovation & Entrepreneurship, Singapore, Singapore), I. Simić (Provincial secretary for economy, employment and gender equality, Novi Sad, Serbia), M. Paroški (University of Novi Sad, Faculty of technical sciences, Novi Sad, Serbia), S. Tomić (University Nikola Tesla, Faculty for Engineering Management, Beograd, Serbia)
GIS in the e-government platform to enable state financial subsidies data transparency
The authors of the paper present an expanded model of e-government. The model includes a new function within the e-government platform: visibility of state-aid activities. Visualization of financial subsidies given by different levels of state administration (national, provincial and local) is presented with Geographic Information System (GIS) tools. Such an approach makes the activities of the public administration transparent to all stakeholders and contributes to the evolution of e-democracy. The paper presents a pilot attempt of the Provincial Government of Vojvodina to implement such an innovative tool in its communication with citizens. The scientific contribution of this paper is a GIS tool which brings new spatial visibility into the transparency of state activities.
|K. Jakimoski, S. Arsenovski, L. Gorachinova, S. Chungurski, O. Iliev, L. Djinevski, E. Kamcheva (FON University, Skopje, Macedonia)
Evaluation of Caching Techniques for Video on Demand in Named Data Networks
Although the main driver for ICN (Information-Centric Networking) has been the rise in Internet video traffic, the interplay between ICN and video has not been much researched in the available literature. Which ICN caching strategies work well on video workloads, and how ICN helps improve video-centric QoE (Quality of Experience), need more thorough research from the academic community. In this work we study three ICN caching strategies: LRU, FIFO and LFU. A detailed evaluation of these three schemes is carried out in terms of the key performance metrics, and important conclusions are drawn. The performance metrics studied in the ICN literature include cache hit ratio, throughput and server load. We extensively evaluate the combination of the three content replacement schemes on four different cache sizes: 1 GB, 10 GB, 100 GB and 1 TB. The focus of our work is to evaluate the above-mentioned schemes on video workloads in order to improve user experience and network performance (lower congestion and lower server load).
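The three replacement policies under evaluation can be reproduced in a few lines. A toy trace replay like this, with unit-size objects, already shows how the policies diverge on the same request stream; the paper's gigabyte-scale caches reduce to the same per-object bookkeeping:

```python
from collections import Counter, OrderedDict, deque

def simulate(policy, capacity, requests):
    """Replay a request trace through a cache of `capacity` unit-size
    items and return the hit ratio."""
    hits = 0
    if policy == "LRU":
        cache = OrderedDict()
        for r in requests:
            if r in cache:
                hits += 1
                cache.move_to_end(r)              # mark most recently used
            else:
                if len(cache) >= capacity:
                    cache.popitem(last=False)     # evict least recently used
                cache[r] = True
    elif policy == "FIFO":
        cache, order = set(), deque()
        for r in requests:
            if r in cache:
                hits += 1
            else:
                if len(cache) >= capacity:
                    cache.discard(order.popleft())  # evict oldest insertion
                cache.add(r)
                order.append(r)
    elif policy == "LFU":
        cache, freq = set(), Counter()
        for r in requests:
            freq[r] += 1
            if r in cache:
                hits += 1
            else:
                if len(cache) >= capacity:
                    evict = min(cache, key=lambda x: freq[x])  # least frequent
                    cache.discard(evict)
                cache.add(r)
    return hits / len(requests)
```

On skewed video-popularity traces LFU and LRU typically keep the hot items and beat FIFO, which is the kind of divergence the paper measures at scale.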
| Coffee break 20 min
| Chairs: Roman Trobec and Uros Stanic, Institut Jožef Štefan, Slovenia
|A. Badnjevic (1. Verlab Ltd Sarajevo; 2. International Burch University Sarajevo; 3. University of Sarajevo Facult, Sarajevo, Bosnia and Herzegovina), L. Gurbeta (1. Verlab Ltd Sarajevo; 2. International Burch University Sarajevo, Sarajevo, Bosnia and Herzegovina), M. Cifrek (Faculty of Electrical Engineering and Computing Zagreb, University of Zagreb, Zagreb, Croatia), D. Marjanović (International Burch University Sarajevo, Sarajevo, Bosnia and Herzegovina)
Diagnostic of Asthma Using Fuzzy Rules Implemented in Accordance with International Guidelines and Physicians Experience
This paper presents a system for classification of asthma based on fuzzy rules. The fuzzy rules are defined according to the Global Initiative for Asthma (GINA) guidelines, as well as through consultations with pulmologists with long-term experience. Our fuzzy system for classification of asthma is based on a combination of spirometry (SPIR) and Impulse Oscillometry System (IOS) test results, which are the inputs to the fuzzy system. Additionally, the use of bronchodilatation and bronchoprovocation enabled a complete dynamic assessment of the patient rather than a simple static assessment. The system was retrospectively tested on 1250 medical reports established by pulmologists, out of which 728 were diagnosed with asthma and 522 were healthy subjects. Of the 728 asthmatic patients, 91.89% were correctly classified, while the healthy subjects were classified with 95% accuracy. Sensitivity and specificity were also assessed, at 91.90% and 95.02%, respectively.
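The fuzzy-rule machinery such a classifier rests on can be sketched as follows. The membership breakpoints, the two inputs and the single rule below are illustrative stand-ins, not the GINA cut-offs or the paper's actual rule base:

```python
def trapezoid(x, a, b, c, d):
    """Standard trapezoidal membership function: 0 outside [a, d],
    1 on [b, c], linear ramps in between."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    if x < b:
        return (x - a) / (b - a)
    return (d - x) / (d - c)

def classify(fev1_pct, reversibility_pct):
    """Toy two-input fuzzy classifier.  Rule: IF FEV1 is low AND
    bronchodilator reversibility is high THEN asthma (AND as min,
    defuzzified by a simple 0.5 cut)."""
    low_fev1 = trapezoid(fev1_pct, 40.0, 50.0, 70.0, 80.0)
    high_rev = trapezoid(reversibility_pct, 8.0, 12.0, 100.0, 101.0)
    asthma = min(low_fev1, high_rev)
    return "asthma" if asthma >= 0.5 else "healthy"
```

A realistic system would combine many such rules over SPIR and IOS measurements and aggregate their firing strengths before defuzzification.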
|P. Lavrič, M. Depolli (Jožef Stefan Institute, Ljubljana, Slovenia)
Robust Beat Detection on Noisy Differential ECG
The differential ECG measurement is performed by a small gadget attached to a person's chest. In contrast to the standard 12-channel ECG, which is most often short and measured on a resting subject, the small form factor of the gadget allows several-days-long measurements of active subjects. The resulting measurements are not only novel in their time and activity coverage; they also require a novel approach to their analysis. First, the ECGs are noisier, because the electrodes are placed close together and because measurements are made on physically active subjects. Second, the gadget is not always placed on by a trained professional and can be mis-oriented, producing a wide range of ECG orientations. Finally, the ECGs are measured at a low sampling rate, which helps conserve the battery life of the gadget. In this work, we develop the first stage of such differential ECG analysis: algorithms for noise estimation and for beat detection. These algorithms have to be able to deal with noisy and sparsely sampled ECGs and with various beat shapes. Furthermore, they should be able to run in real time, applied to the ongoing measurement on very computationally weak portable devices, while maintaining power efficiency. Two algorithms for noise estimation and one for beat detection that fulfill these constraints are presented in this work.
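A generic sketch of a beat-detection stage under these constraints, assuming an adaptive threshold with a refractory period; this is an illustrative baseline, not the paper's algorithm, which additionally handles noise estimation and varied beat shapes:

```python
def detect_beats(signal, fs, refractory_s=0.25, decay=0.7):
    """Adaptive-threshold peak detector on a filtered ECG-like signal.
    The threshold tracks a fraction of recent peak amplitudes and a
    refractory period suppresses double detections of the same beat."""
    if not signal:
        return []
    threshold = max(signal) * 0.5        # simple initialization (not streaming)
    refractory = int(refractory_s * fs)  # minimum sample gap between beats
    beats, last = [], -refractory
    for i in range(1, len(signal) - 1):
        is_peak = signal[i] > signal[i - 1] and signal[i] >= signal[i + 1]
        if is_peak and signal[i] > threshold and i - last >= refractory:
            beats.append(i)
            last = i
            # slowly adapt toward half of the latest peak amplitude
            threshold = decay * threshold + (1 - decay) * 0.5 * signal[i]
    return beats
```

The per-sample work is a few comparisons and one multiply-add, which is the kind of budget a computationally weak wearable device can sustain in real time.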
|A. Badnjevic (1. Verlab Ltd Sarajevo; 2. International Burch University Sarajevo; 3. University of Sarajevo Facult, Sarajevo, Bosnia and Herzegovina), L. Gurbeta (1. Verlab Ltd Sarajevo; 2. International Burch University Sarajevo;, Sarajevo, Bosnia and Herzegovina), M. Cifrek (Faculty of Electrical Engineering and Computing Zagreb, University of Zagreb, Zagreb, Croatia), D. Marjanović (International Burch University Sarajevo, Sarajevo, Bosnia and Herzegovina)
Classification of Asthma Using Artificial Neural Network
This paper presents a system for classification of asthma based on artificial neural network.
A total of 1800 medical reports were used for neural network training. The system was subsequently validated using 1250 medical reports established by pulmologists from a hospital in Sarajevo. Of these, 728 were diagnoses of asthma, while 522 were healthy subjects. Of the 728 asthmatics, 97.11% were correctly classified, and the healthy subjects were classified with an accuracy of 98.85%. Sensitivity and specificity were also assessed, at 97.12% and 98.85%, respectively.
Our system for classification of asthma is based on a combination of spirometry (SPIR) and Impulse Oscillometry System (IOS) test results, whose measurements are the inputs to the artificial neural network. The network is implemented to obtain both a static and a dynamic assessment of the patient's respiratory system.
|K. Friganović, M. Medved, M. Cifrek (University of Zagreb, Faculty of Electrical Engineering and Computing, Zagreb, Croatia)
Brain-Computer Interface Based on Steady-State Visual Evoked Potentials
A brain-computer interface (BCI) can establish communication between the human brain and a computer independent of normal neuromuscular pathways. This allows giving instructions to a computer without standard input channels such as a mouse or keyboard. This paper describes the development of a steady-state visual evoked potential (SSVEP) based BCI system. An EEG amplifier with one bipolar channel is designed for acquisition of raw EEG data from the posterior region of the head, over the occipital lobe. Three white LED chessboards with programmable flicker frequencies are used as stimulation to induce different SSVEPs. For feature extraction, the Fourier transform of the autocorrelation of the EEG signal is used. A graphical user interface (GUI) application implemented in MATLAB shows real-time graphs of the EEG signal in both the time and frequency domains. A simple game of turning three light bulbs on and off by looking at the different LED chessboards is also implemented in the GUI.
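The feature-extraction step (Fourier transform of the autocorrelation) can be sketched in a few lines. A plain O(n²) DFT is used here for self-containment, where a real-time implementation would use an FFT; the signal shape and sampling rate are illustrative:

```python
import cmath, math

def autocorrelation(x):
    """Mean-removed, one-sided autocorrelation r[k] = sum_n x[n]*x[n+k].
    Autocorrelation sharpens a periodic stimulus response against
    uncorrelated noise, which is why it precedes the transform."""
    n = len(x)
    mean = sum(x) / n
    x = [v - mean for v in x]
    return [sum(x[i] * x[i + k] for i in range(n - k)) for k in range(n)]

def dft_magnitude(x):
    """Magnitude spectrum via a direct DFT (an FFT would be used in practice)."""
    n = len(x)
    return [abs(sum(x[t] * cmath.exp(-2j * math.pi * f * t / n)
                    for t in range(n))) for f in range(n)]

def dominant_frequency(signal, fs):
    """The SSVEP feature: frequency of the strongest spectral peak of
    the signal's autocorrelation."""
    spectrum = dft_magnitude(autocorrelation(signal))
    half = spectrum[1:len(spectrum) // 2]      # skip DC, keep one side
    peak_bin = 1 + half.index(max(half))
    return peak_bin * fs / len(signal)
```

Matching the detected dominant frequency against the known flicker frequencies of the three chessboards then selects which target the subject is attending to.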
|A. Krvavica, Š. Likar (student of Veterinary faculty, University of Ljubljana, Ljubljana, Slovenia), M. Brložnik (PRVA-K, Small Animal Clinic, Ljubljana, Slovenia), A. Domanjko-Petrič (Clinic for small animals and surgery, Veterinary faculty, University of Ljubljana, Ljubljana, Slovenia), V. Avbelj (Department of Communication Systems, Jožef Stefan Institute, Ljubljana, Slovenia)
Comparison of Wireless Electrocardiographic Monitoring and Standard ECG in Dogs
Electrocardiographic (ECG) data obtained by a wireless body electrode attached to the skin and connected to a smart device via low-power Bluetooth technology were compared with a standard ECG in 8 dogs. The ECG data were gained from dogs with suspected arrhythmias due to cardiac or systemic diseases. A 2-minute standard ECG was compared to a 15-minute recording obtained with wireless body electrodes. It was established that this wireless electrocardiographic monitoring is a sensitive and specific method for identification of heart rate, duration of ECG waves and arrhythmias. When compared to a standard ECG, equivalent results were obtained for the heart rate and the duration of different waves. Due to the longer recording time, the wireless device was more sensitive in documenting arrhythmias. With the wireless body electrodes, the ECG data were obtained while the dogs were lying down, standing or walking. The wireless electrode proved to be reliable and simple to use, and the device enables a good option for long-term monitoring of canine cardiac rhythm in a real-world environment.
|J. Tasic (University of Ljubljana, Ljubljana, Slovenia), M. Gusev, S. Ristov (Ss. Cyril and Methodius University, Faculty of Computer Science and Engineering, Skopje, Macedonia)
A Medical Cloud
Recent trends and developments in the production of various wearable biosensors make a wealth of medical and environmental information available for each human being. Processing the data coming from these sensors, extracting valuable information and analyzing the medical record with sufficient expertise is a complex processing task that requires more resources than an ordinary mobile device or personal computer can provide with today's technology. In this paper, we propose a cloud-based solution to these challenges. A medical cloud hosts a specially developed application which communicates with medical devices and sensors on one side and, on the other, alerts a medical institution when a problem is detected in the analysis of the received sensor data.
|A. Celesti, M. Fazio, A. Romano, M. Villari (University of Messina, Messina, Italy)
A Hospital Cloud-Based Open Archival Information System for the Efficient Management of HL7 Big Data
Nowadays, the Open Archival Information System (OAIS) model is widely adopted in hospitals to manage data related to both doctors and patients. However, the archival storage systems of hospitals are typically based on old relational DBMSs, which makes the management of patients' data difficult, especially in HL7 format. In fact, data have to be continuously parsed in order to be stored in relational databases and sent to other hospital systems. In interoperable scenarios, where HL7 data grow continuously, the management of patients' information can become very hard. In this paper, we discuss an OAIS system able to manage HL7 Big Data. In particular, considering HL7 glucose observations in JSON format, we demonstrate that, in a scalable scenario, an archival storage designed for big-data processing is more convenient for hospitals than traditional archival storage systems.
|Y. Gao, C. Lin ( College of Physics and Information Engineering, Fuzhou University, Fuzhou, China), S. Pun , M. Vai (Department of Electrical and Computer Engineering, Faculty of Science and Technology,University of M, Macao, China), M. Du (College of Physics and Information Engineering, Fuzhou University, Fuzhou, China)
Recognition and Adjustment for Strip Background Baseline in Fluorescence Immuno-chromatographic Detection System
Fluorescence immunochromatography is a kind of immunoassay widely used in quantitative detection. However, background baseline drift with relatively large amplitude during testing is a serious problem for the overall performance of a quantitative detection system, and existing background baseline recognition and adjustment methods are based on online or offline data processing, which brings a huge computational burden. To solve this problem, this paper proposes a background baseline recognition and adjustment method based on optical design. It integrates background acquisition, conditioning and control functions which adjust the photo-electric signal of the background in real time until the sampled value of the background baseline is within the threshold; finally, the peak value of the test strip is detected. To examine the performance of the system, ten standard test strips with different concentrations were selected for background baseline detection, and the linearity and coefficient of variation (CV) of the system were calculated. Results showed that the fluctuation range of the baseline was within ±4% of the threshold, whereas the CV of the test strip background was 2.57%. Furthermore, although the occupancy rate of the background amplitude in the A/D unit decreased from 31.7% to 2.4%, the CV of the system remained below 3%, which indicates that the system has a much wider detection range than other comparable detection devices.
|S. Vrhovec (University of Maribor, Ljubljana, Slovenia)
Agile Development of a Hospital Information System
Agile software development methods have spread rapidly since their formal introduction in the Agile Manifesto at the dawn of the millennium. They quickly gained support in the software industry and can be considered a standard today. Agile methods seem to improve software project success rates and offer developers the flexibility needed to adapt to changing user requirements. However, the use of agile methods has rarely been studied in large-scale information systems development projects. There are even fewer insights in the area of agile healthcare information systems development. In this paper, we present insights into an agile hospital information system development in a European hospital. Agile development has been studied from various stakeholder perspectives. Results show that physicians, nurses and administration have diverse opinions on agile practices, and that different practices may be appropriate for interacting with each stakeholder.
|D. Kučak, G. Đambić (University College Algebra - University College for Applied Computer Engineering, Zagreb, Croatia), V. Kokanović (IN2 Data, Zagreb, Croatia)
SOA Based Interoperability Component for Healthcare Information System
In this paper, a new SOA-based interoperability component developed for the Health Information System (HIS) project of the Special Hospital for Pulmonary Diseases (SHPD) in Croatia is proposed. The component is developed as a standalone server that enables communication between the HIS and other systems such as the Laboratory Information System (LIS), the Radiology Information System (RIS), e-lists, e-ordering etc. Managing complex medical information, as well as integrating heterogeneous systems in the SHPD, is solved by introducing HL7 and Web services. Finally, the performance of the communication between the HIS and other systems based on the interoperability component is analyzed.
|F. Grilec, Ž. Lučev Vasić (Fakultet elektrotehnike i računarstva, Zagreb, Croatia), W. Ni, Y. Gao, M. Du (Fuzhou University, Laboratory of Medical Instrumentation and Pharmaceutical Technology, Fuzhou, China), M. Cifrek (Fakultet elektrotehnike i računarstva, Zagreb, Croatia)
Wireless Intrabody Communication Sensor Node Realized Using PSoC Microcontroller
The aim of this paper is the development of a frequency-modulated signal generator and receiver for intrabody communication using PSoC microcontrollers by Cypress. The firmware is developed in the Cypress PSoC Creator and PSoC Designer tools for the CY8C27643-24PVXI and CY8C5888LTI-LP097 microcontrollers.
Based on the input received from the serial bus, the signal generator synthesizes an FSK signal using digital-to-analog conversion. The signal is transmitted between transmitter and receiver electrodes placed on the human body. The received signal is then demodulated using low-pass and band-pass filters and a correlator, and sent back to the computer over the serial bus.
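The correlator-based FSK demodulation described above can be sketched as follows; the sample rate, tone frequencies and symbol length are illustrative assumptions, not the values used on the PSoC hardware:

```python
import numpy as np

fs = 10000           # sample rate in Hz (illustrative)
f0, f1 = 1000, 2000  # FSK tone frequencies for bits 0 and 1 (illustrative)
sym = 100            # samples per symbol; both tones fit whole cycles

def fsk_modulate(bits):
    # Emit one sine burst per bit, at f1 for a 1 and f0 for a 0.
    t = np.arange(sym) / fs
    return np.concatenate(
        [np.sin(2 * np.pi * (f1 if b else f0) * t) for b in bits])

def fsk_demodulate(signal):
    # Correlate each symbol-long chunk against both reference tones;
    # the tone with the larger correlation magnitude decides the bit.
    t = np.arange(sym) / fs
    ref0 = np.sin(2 * np.pi * f0 * t)
    ref1 = np.sin(2 * np.pi * f1 * t)
    bits = []
    for i in range(0, len(signal), sym):
        chunk = signal[i:i + sym]
        bits.append(int(abs(np.dot(chunk, ref1)) > abs(np.dot(chunk, ref0))))
    return bits

tx = [1, 0, 1, 1, 0]
rx = fsk_demodulate(fsk_modulate(tx))
print(rx)
```

Because each tone completes an integer number of cycles per symbol, the two references are orthogonal over the correlation window, which is what makes the simple dot-product decision reliable.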
|J. Slak, G. Kosec (Jožef Stefan Institute, Department of Communication Systems, Ljubljana, Slovenia)
Detection of Heart Rate Variability from a Wearable Differential ECG Device
The precise beat-to-beat variability is extracted from an ECG signal measured by a wearable sensor that continuously records the heart activity of an active subject for several days. Due to the limited resources of the mobile device, the ECG can only be sampled at a relatively low frequency of approximately 100 Hz. Besides the low sampling rate, the signal from a wearable sensor is also burdened with much more noise than a standard 12-channel ECG, mostly due to the design of the device, i.e. the electrodes are positioned relatively close to each other, and the fact that the subject is active during the measurements. To extract beat-to-beat variability with 1 ms precision, i.e. 10 times finer than the sampling period of the measured signal, a two-step algorithm is proposed. In the first step, an approximate global search is performed, roughly determining the point of interest, followed by a local search based on the Moving Least Squares approximation to refine the result. The methodology is evaluated in terms of accuracy, noise sensitivity, and computational complexity. All tests are performed on simulated as well as measured data. It is demonstrated that the proposed algorithm provides accurate results at a low computational cost and is robust enough for practical application.
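The two-step idea (a coarse global search at sample resolution, then a local least-squares refinement below the sampling period) can be sketched as follows; a simple quadratic fit stands in here for the paper's Moving Least Squares approximation, and the synthetic beat is purely illustrative:

```python
import numpy as np

def refine_peak(signal, fs=100.0, window=5):
    # Step 1: coarse global search at sample resolution.
    i = int(np.argmax(signal))
    # Step 2: local least-squares quadratic fit around the coarse peak;
    # the parabola vertex gives a sub-sample peak position.
    lo, hi = max(0, i - window), min(len(signal), i + window + 1)
    x = np.arange(lo, hi)
    a, b, _ = np.polyfit(x, signal[lo:hi], 2)
    return (-b / (2.0 * a)) / fs  # peak time in seconds

# Synthetic "beat": a Gaussian pulse peaking at 0.503 s, sampled at
# 100 Hz, so the nearest sample lies 3 ms away from the true peak.
t = np.arange(0.0, 1.0, 1.0 / 100.0)
beat = np.exp(-((t - 0.503) ** 2) / (2 * 0.05 ** 2))
print(refine_peak(beat))
```

On this example the coarse step alone is limited to the 10 ms sample grid, while the local fit recovers the peak position to well under a millisecond.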
| Lunch break
|T. Poplas Susič (Health Centre Ljubljana, Ljubljana, Slovenia), U. Stanič (Kosezi d.o.o., Ljubljana, Slovenia)
Penetration of the ICT Technology to the Health Care Primary Sector - Ljubljana PILOT
According to OECD data, Slovenia has, among EU countries, the lowest number of subjects who could not afford health care services for financial or other reasons. Slovenia is in the group of 17 EU countries that assure health care services to all their citizens and even to immigrants. A comprehensive analysis by the WHO also revealed that Slovenia has a well-covered and available primary health care sector. These facts motivated a large national consortium with over a hundred stakeholders from industry and academia to submit the e-Health and m-Health (EMZ) project proposal to the cohesion funds call as a part of the SPS national system. The ambitious project goals until 2020 are to reach 30% of the national population with ICT-supported services, with ultimate effects reflected in 10% lower costs for health care, which could result in 2% GDP growth and could generate 5000 new work places. The EMZ realization has already started with several project pilots based on a multifunctional body sensor of vital functions. One of the most important is tailored to the Community Health Centre Ljubljana, which consists of 7 units. Their advantage is that all of them offer continuous care to the citizens located in their region, healthy and ill, young and elderly, as well as to all daily migrants and visitors of Ljubljana. They are committed to delivering immediate care when patients visit them. The pilot located in Ljubljana will gather the responses of medical personnel to the system user interfaces, as well as the system acceptability level assessed with regard to patients, families and caregivers. The Community Health Centre Ljubljana (CHC) is a development-oriented institution in primary health care with more than 1400 employees and more than 440,000 registered patients. They wish to ensure high-quality and time-wise optimal access to health care services for all of their users in all segments of activity.
The patients come from Ljubljana and its periphery with rural areas and are treated within the medical doctrine and defined ethical aspects. The proposed pilot system could trigger the penetration of ICT into medicine at the primary care level, which could improve and complete integrated health care with evidence-based decisions. Disabled patients who cannot visit their physician could also be supervised and treated through ICT. Additionally, the participating patients occasionally experiencing difficulties with heart rhythm will be provided with comprehensive care in terms of screening with preventive and curative treatments, and diagnostics. Initial results of the study, including the acquired measurements and system efficiency, are presented and discussed in the paper.
|A. Šerifović-Trbalić (University of Tuzla, Faculty of Electrical Engineering, Tuzla, Bosnia and Herzegovina), A. Trbalić (Drvodom doo, Tuzla, Bosnia and Herzegovina)
Image-based Metal Artifact Reduction in CT Images
The presence of metal in the scanning field of a CT scanner can create so-called metal artifacts in the reconstructed images. These streak artifacts obscure information about anatomical structures, making it difficult for radiologists to correctly interpret the images or for computer programs to analyze them. Several methods proposed in the literature are based on the reconstruction of the missing/corrupted projection data using the raw data directly from the tomographs. This paper proposes an image-based strategy consisting of image registration and morphological reconstruction. Preliminary results on CT images from different CT scanners and patients are presented and discussed.
|V. Jazbinšek (Institute of Mathematics, Physics and Mechanics, Ljubljana, Slovenia)
New Algorithm for Automatic Determination of Systolic and Diastolic Blood Pressures in Oscillometric Measurements
Most automated non-invasive blood pressure measuring devices are based on empirically derived criteria applied to the oscillometric index, which is defined as a certain characteristic physical property of pressure pulses measured by an inflated cuff wrapped around the upper arm during cuff pressure deflation. In this study, different algorithms for automatic determination of the systolic and diastolic pressures are compared. In addition to the measured pressure pulses, which are the typical physical property used for the oscillometric index, some other properties are applied, such as the time derivative and a powered short-time variance of the pressure data, as well as the audible part of the data measured by a microphone embedded in the cuff (Korotkoff sounds). Besides known empirical algorithms based on characteristic ratios of the oscillometric pulse amplitude and on the maximum and minimum slope of the oscillometric envelope, a new algorithm is introduced based on the presence and absence of oscillometric activity. This algorithm can be applied either to the powered short-time variance of the pressure changes in the cuff or to the Korotkoff sounds.
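For context, the classic characteristic-ratio approach mentioned above can be sketched as follows; the ratios 0.5 and 0.8 are common textbook values rather than those used in the paper, and the deflation data is entirely synthetic:

```python
import numpy as np

def characteristic_ratio_bp(cuff_pressure, envelope, rs=0.5, rd=0.8):
    # MAP: cuff pressure at the maximum of the oscillation envelope.
    i_map = int(np.argmax(envelope))
    emax = envelope[i_map]
    # SBP: highest pressure (earliest deflation sample) at which the
    # envelope already reaches rs * emax before the maximum.
    i_sys = np.nonzero(envelope[:i_map] >= rs * emax)[0][0]
    # DBP: lowest pressure after the maximum at which the envelope
    # still holds rd * emax.
    i_dia = i_map + np.nonzero(envelope[i_map:] >= rd * emax)[0][-1]
    return cuff_pressure[i_sys], cuff_pressure[i_map], cuff_pressure[i_dia]

# Synthetic deflation from 180 to 40 mmHg; the oscillation envelope is
# modeled as a Gaussian peaking near a MAP of 95 mmHg (illustrative).
p = np.linspace(180.0, 40.0, 300)
env = np.exp(-((p - 95.0) ** 2) / (2 * 20.0 ** 2))
sbp, mbp, dbp = characteristic_ratio_bp(p, env)
print(sbp, mbp, dbp)
```

This is the baseline family of algorithms the abstract compares against; the paper's new method instead looks for the onset and cessation of oscillometric activity, which this sketch does not attempt to reproduce.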
|K. Križanović (FER, Zagreb, Croatia), M. Marinović (Atos IT Solutions and Services d.o.o., Zagreb, Croatia), A. Bulović (Humboldt-Universität, Berlin, Germany), R. Vaser, M. Šikić (FER, Zagreb, Croatia)
TGTP-DB – a Database for Extracting Genome, Transcriptome and Proteome Data Using Taxonomy
An enormous potential of metagenomics has caused significantly increased research activity in recent years. However, discovering the pathogens present in a metagenomic sample is still inadequately accurate and very time consuming. Narrowing down the list of potential pathogens would significantly decrease the reference database size and thus improve both accuracy and speed. While publicly available databases enable free access to all their data, their interfaces are generally unwieldy and impractical for quickly producing concrete reference datasets, making the process either very time consuming or dependent on significant programming skills. Relatively simple yet sufficiently powerful tools allowing users to quickly extract the desired reference datasets are therefore needed.
Results: In this paper we present a set of scripts for downloading publicly available data on taxonomy, genomes, transcriptomes and proteomes and loading it into a relational database. This enables the user to use SQL and other standard relational database tools to search for and extract specific datasets. The power and ease of use of TGTP-DB are demonstrated with a search engine that allows the user to search the taxonomy tree for a desired group of organisms and download files containing the genomes, transcriptomes and proteomes of the organisms in that group.
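The kind of taxonomy-driven extraction such a relational database enables can be sketched with a recursive SQL query; the schema, table names, and sample rows below are hypothetical stand-ins, not the actual TGTP-DB schema:

```python
import sqlite3

# Hypothetical minimal schema in the spirit of TGTP-DB; table and
# column names are illustrative, not the project's actual schema.
db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE taxonomy (tax_id INTEGER PRIMARY KEY, parent_id INTEGER, name TEXT);
CREATE TABLE genome (tax_id INTEGER, accession TEXT);
INSERT INTO taxonomy VALUES (1, NULL, 'Bacteria'),
                            (2, 1, 'Proteobacteria'),
                            (3, 2, 'Escherichia coli'),
                            (4, 1, 'Firmicutes');
INSERT INTO genome VALUES (3, 'GCF_000005845'), (4, 'GCF_000009045');
""")

# Recursive CTE: collect a taxon and all of its descendants, then join
# their genome accessions -- a concrete reference dataset in one query.
rows = db.execute("""
WITH RECURSIVE subtree(tax_id) AS (
    SELECT tax_id FROM taxonomy WHERE name = 'Proteobacteria'
    UNION ALL
    SELECT t.tax_id FROM taxonomy t JOIN subtree s ON t.parent_id = s.tax_id
)
SELECT g.accession FROM genome g JOIN subtree USING (tax_id)
""").fetchall()
print(rows)
```

With the taxonomy stored relationally, narrowing a reference set to one branch of the tree is a single query rather than a manual download-and-filter session.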
|A. Badnjevic (1. Verlab Ltd Sarajevo; 2. International Burch University Sarajevo; 3. University of Sarajevo Facult, Sarajevo, Bosnia and Herzegovina), L. Gurbeta (1. Verlab Ltd Sarajevo; 2. International Burch University Sarajevo;, Sarajevo, Bosnia and Herzegovina)
Development and Perspectives of Biomedical Engineering in South East European Countries
With a growing population, chronic disease management and population aging, medicine and health care have changed drastically over the past decades. Engineering became more involved in medicine, resulting in the development of a new discipline: Biomedical Engineering.
As new engineering solutions for problems in medicine regarding therapy, diagnosis and treatment have emerged, the need for new interdisciplinary educational programmes has evolved.
Today, South East European countries are focused on developing new educational programmes in Biomedical Engineering, following the models established in other European countries, and on recognizing Biomedical Engineering as a professional discipline.
The International Federation for Medical and Biological Engineering, supported by the UN, supports national societies that are focused on Biomedical Engineering.
In this article, an overview of the development and perspectives of Biomedical Engineering in South East European (SEE) countries is given, with a focus on Bosnia and Herzegovina.
|A. Rashkovska, D. Kocev, R. Trobec (Jožef Stefan Institute, Ljubljana, Slovenia)
Clustering of Heartbeats from ECG Recordings Obtained with Wireless Body Sensors
Long-term electrocardiographic (ECG) recordings are intended to help detect heart diseases. A wireless multi-function biosensor that measures the potential difference between two proximal electrodes on the skin enables monitoring of vital functions - heart activity and respiration. It can thus make long-term ECG measurements while the subject performs his everyday duties and activities. These measurements are significantly longer and more heterogeneous than measurements performed in a controlled hospital environment. Consequently, manual inspection of these recordings in order to identify different groups/clusters of heartbeats (which can be used to better describe the health status of the subject) is a tedious, hard and expensive job. In this paper, we propose a method for automatic clustering of heartbeats from an ECG obtained with a wireless body sensor. First, we investigate different signal pre-processing and segmentation techniques. Second, we evaluate state-of-the-art data mining methods for time series clustering, such as k-means with Euclidean distance, cosine similarity and dynamic time warping, as well as predictive clustering trees for multi-target prediction. The obtained clusters will then be analysed and used to derive meaningful heartbeat categories.
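As a minimal illustration of the simplest of the evaluated methods, k-means with Euclidean distance on fixed-length beat segments can be sketched as follows (DTW, cosine similarity and predictive clustering trees are not shown); the two beat templates and the noise model are synthetic assumptions:

```python
import numpy as np

def kmeans2(beats, iters=20):
    # Deterministic farthest-point initialization: the first beat, plus
    # the beat farthest from it in Euclidean distance.
    c0 = beats[0]
    c1 = beats[np.linalg.norm(beats - c0, axis=1).argmax()]
    centers = np.stack([c0, c1])
    for _ in range(iters):
        # Assign each beat to the nearest center, then move each center
        # to the mean of its assigned beats.
        d = np.linalg.norm(beats[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        centers = np.stack([beats[labels == j].mean(axis=0) for j in range(2)])
    return labels

# Synthetic fixed-length beat segments: ten noisy copies each of two
# hypothetical templates ("normal" and "ectopic" shapes, illustrative).
t = np.linspace(0.0, 1.0, 50)
normal = np.exp(-((t - 0.50) ** 2) / 0.005)
ectopic = np.exp(-((t - 0.35) ** 2) / 0.02)
noise = 0.05 * np.random.default_rng(0).standard_normal((20, 50))
beats = np.vstack([normal + noise[:10], ectopic + noise[10:]])
labels = kmeans2(beats)
print(labels)
```

On wearable data the interesting part is upstream of this step: the pre-processing and segmentation that turn a noisy continuous recording into comparable fixed-length beats.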
|M. Mohorčič, M. Depolli (Jožef Stefan Institute, Ljubljana, Slovenia)
Heart Rate Analysis with NevroEkg
NevroEkg is a computer application for the analysis of ECG and related bio-signals, such as breathing and blood pressure. It was made in collaboration between computer scientists, engineers and neurocardiologists. Recently, it has been modified to also support unconventional differential ECG measurements made with wireless wearable devices. These wearable devices measure ECG somewhat differently: with lower resolution, lower sampling frequency, and more noise. These features require modified and additional processing of the ECG signal, which is not required for standard 12-channel ECGs. A novel algorithm is proposed to help the human operator handle beat detection in such ECG measurements.
Karolj Skala (Croatia), Roman Trobec (Slovenia), Uroš Stanič (Slovenia)
Enis Afgan (Croatia), Almir Badnjevic (Bosnia and Herzegovina), Piotr Bala (Poland), Leo Budin (Croatia), Jelena Čubrić (Croatia), Borut Geršak (Slovenia), Simeon Grazio (Croatia), Gordan Gulan (Croatia), Yike Guo (United Kingdom), Ladislav Hluchy (Slovakia), Željko Jeričević (Croatia), Peter Kacsuk (Hungary), Aneta Karaivanova (Bulgaria), Zalika Klemenc-Ketiš (Slovenia), Charles Loomis (France), Ludek Matyska (Czech Republic), Željka Mihajlović (Croatia), Damijan Miklavčič (Slovenia), Tonka Poplas Susič (Slovenia), Laszlo Szirmay-Kalos (Hungary), Tibor Vámos (Hungary), Matjaž Veselko (Slovenia), Yingwei Wang (Canada)
Registration / Fees:
|Price in EUR
|Before 16 May 2016
|After 16 May 2016
|Members of MIPRO and IEEE
|Students (undergraduate and graduate), primary and secondary school teachers
Rudjer Boskovic Institute
HR-10000 Zagreb, Croatia
DC VIS PPT presentation repository
PPT presentations can be uploaded at: http://dc-vis.irb.hr
DC VIS Video repository
Video lectures can be found at: https://www.youtube.com/channel/UC2C4Htp03gxRNX73A06UexA
The best papers will get a special award.
Accepted papers will be published in the ISBN registered conference proceedings. Papers presented at the Conference will be submitted for posting to IEEE Xplore.
Authors of outstanding papers will be invited to submit the extended version of their papers to a special issue of Scalable Computing: Practice and Experience (ISSN 1895-1767) published in the first quarter of 2017.
International Program Committee General Chair:
Petar Biljanović (Croatia)
International Program Committee:
Slavko Amon (Slovenia), Vesna Anđelić (Croatia), Michael E. Auer (Austria), Mirta Baranović (Croatia), Almir Badnjevic (Bosnia and Herzegovina), Bartosz Bebel (Poland), Ladjel Bellatreche (France), Eugen Brenner (Austria), Andrea Budin (Croatia), Željko Butković (Croatia), Željka Car (Croatia), Matjaž Colnarič (Slovenia), Alfredo Cuzzocrea (Italy), Marina Čičin-Šain (Croatia), Marko Delimar (Croatia), Todd Eavis (Canada), Maurizio Ferrari (Italy), Bekim Fetaji (Macedonia), Tihana Galinac Grbac (Croatia), Paolo Garza (Italy), Liljana Gavrilovska (Macedonia), Matteo Golfarelli (Italy), Stjepan Golubić (Croatia), Francesco Gregoretti (Italy), Stjepan Groš (Croatia), Niko Guid (Slovenia), Yike Guo (United Kingdom), Jaak Henno (Estonia), Ladislav Hluchy (Slovakia), Vlasta Hudek (Croatia), Željko Hutinski (Croatia), Mile Ivanda (Croatia), Hannu Jaakkola (Finland), Leonardo Jelenković (Croatia), Dragan Jevtić (Croatia), Robert Jones (Switzerland), Peter Kacsuk (Hungary), Aneta Karaivanova (Bulgaria), Mladen Mauher (Croatia), Igor Mekjavic (Slovenia), Branko Mikac (Croatia), Veljko Milutinović (Serbia), Vladimir Mrvoš (Croatia), Jadranko F. Novak (Croatia), Jesus Pardillo (Spain), Nikola Pavešić (Slovenia), Vladimir Peršić (Croatia), Tomislav Pokrajcic (Croatia), Slobodan Ribarić (Croatia), Janez Rozman (Slovenia), Karolj Skala (Croatia), Ivanka Sluganović (Croatia), Vlado Sruk (Croatia), Uroš Stanič (Slovenia), Ninoslav Stojadinović (Serbia), Jadranka Šunde (Australia), Aleksandar Szabo (Croatia), Laszlo Szirmay-Kalos (Hungary), Davor Šarić (Croatia), Dina Šimunić (Croatia), Zoran Šimunić (Croatia), Dejan Škvorc (Croatia), Antonio Teixeira (Portugal), Edvard Tijan (Croatia), A. Min Tjoa (Austria), Roman Trobec (Slovenia), Sergio Uran (Croatia), Tibor Vámos (Hungary), Mladen Varga (Croatia), Marijana Vidas-Bubanja (Serbia), Boris Vrdoljak (Croatia), Damjan Zazula (Slovenia)
With its 170-year-long tourist tradition, Opatija is the leading seaside resort of the Eastern Adriatic and one of the most famous tourist destinations on the Mediterranean. With its aristocratic architecture and style, Opatija has been attracting renowned artists, politicians, kings, scientists and sportsmen, as well as business people, bankers and managers, for more than 170 years.
The tourist offering of Opatija includes a vast number of hotels, excellent restaurants, entertainment venues, art festivals, superb modern and classical music concerts, beaches and swimming pools and is able to provide the perfect response to all demands.
Opatija, the Queen of the Adriatic, is also one of the most prominent congress cities on the Mediterranean, particularly important for the international MIPRO ICT conventions, held in Opatija since 1979 and gathering more than a thousand participants from more than forty countries. These conventions promote Opatija as the most desirable technological, business, educational and scientific center in Southeast Europe and the European Union in general.
For more details please look at www.opatija.hr/ and www.opatija-tourism.hr/.