
Technical co-sponsorship

 

The IEEE reserves the right to exclude a paper from distribution after the conference (including its removal from IEEE Xplore) if the paper is not presented at the conference.

Authors of DC VIS 2017 are kindly asked to prepare presentations of at most 10 minutes.

 

Event program
Thursday, 5/25/2017 9:00 AM,
Camelia 2, Grand hotel Adriatic, Opatija
Invited Lectures 
9:00 AM - 9:30 AM Lech Jóźwiak, Eindhoven University of Technology, The Netherlands

Embedded Computing Technology for Advanced Mobile Cyber-physical Systems 

9:30 AM - 9:45 AM N. Megill (Boston Information Group, Lexington MA 02420, United States), M. Pavicic (Institut Rudjer Boskovic, Zagreb, Croatia)
New Classes of Kochen-Specker Contextual Sets 
Finding Kochen-Specker contextual sets proves to be essential for quantum information and quantum computation in particular. It is therefore essential to find algorithms and programs which can generate arbitrary Kochen-Specker sets in a nearly exhaustive manner. In this paper we present such generation for two new classes of Kochen-Specker sets. All sets from one of the classes, as well as the upper part of the sets from the second class, are completely invisible to standard algorithms and programs from the literature. We also describe the methods and programs we used to obtain the sets on supercomputing clusters.
Distributed Computing and Visualization 
9:45 AM - 10:00 AM M. Babič, B. Jerman Blažič (Jožef Stefan Institute, Ljubljana, Slovenia)
New Method for Determination Complexity Using in AD HOC Cloud Computing 
The paper presents a new method for determining the complexity of a new hyper-hybrid ad hoc cloud computing network. A large body of research has been devoted to identifying the complexity of structures in networks. The study of complex networks is a young and active area of scientific research, inspired largely by the empirical study of real-world networks such as computer networks and social networks. Networks are a ubiquitous way to represent complex systems, including those in the social and economic sciences. Understanding complexity and broad systems-level frameworks in the life, physical and social sciences has turned, in recent decades, to issues of network dynamics. It has been shown that many real complex networks share distinctive characteristic properties that differ in many ways from those of random and regular networks. We present some challenging problems of ad hoc networks. In this paper we introduce a new method for quantifying the complexity of a new hyper-hybrid ad hoc cloud network, based on presenting the nodes of the network in Cartesian coordinates, converting to polar coordinates, and calculating the fractal dimension. Our results suggest that this approach can be used to determine the complexity of any type of network.
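The geometric core of the method, placing nodes in Cartesian coordinates, converting to polar form, and estimating a fractal dimension, can be illustrated with a minimal sketch. The box-counting estimator, the scale values, and the sample point sets below are illustrative assumptions, not the authors' implementation:

```python
import math

def polar(points):
    # convert Cartesian node positions (x, y) to polar (r, theta)
    return [(math.hypot(x, y), math.atan2(y, x)) for x, y in points]

def box_count_dimension(points, scales=(1.0, 0.5, 0.25, 0.125)):
    # estimate the box-counting (fractal) dimension of a 2-D point set:
    # count occupied grid cells N(s) at each scale s, then fit the slope
    # of log N(s) against log(1/s) by least squares
    logs = []
    for s in scales:
        cells = {(int(x // s), int(y // s)) for x, y in points}
        logs.append((math.log(1.0 / s), math.log(len(cells))))
    n = len(logs)
    mx = sum(a for a, _ in logs) / n
    my = sum(b for _, b in logs) / n
    num = sum((a - mx) * (b - my) for a, b in logs)
    den = sum((a - mx) ** 2 for a, _ in logs)
    return num / den if den else 0.0
```

On small point sets the estimate is biased low, but denser, more tangled node layouts yield a higher estimated dimension than near-linear ones, which is the property the complexity measure exploits.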
10:00 AM - 10:15 AM V. Lekic (Heilmeyersteige 98, 89075 Ulm, Germany), Z. Babic (Faculty of Electrical Engineering, Banja Luka, Bosnia and Herzegovina)
Neneta: Heterogeneous Computing Complex-Valued Neural Network Framework 
Due to increased demand for computational efficiency in the training, validation and testing of artificial neural networks, many open source software frameworks have emerged. Almost exclusively, the GPU programming model of choice in such software frameworks is CUDA. Symptomatic is also the lack of support for complex-valued neural networks. With our research going exactly in that direction, we developed and made publicly available yet another heterogeneous software framework, completely based on the C++ and OpenCL standards, with which we try to solve the problems we identified with existing solutions.
10:15 AM - 10:30 AM M. Brcic, N. Hlupic (University of Zagreb Faculty of Electrical Engineering and Computing, Zagreb, Croatia)
Cloud-Distributed Computational Experiments for Combinatorial Optimization 
The development of optimization algorithms for combinatorial problems is a complicated process, both guided and validated by the computational experiments over the different scenarios. Since the number of experiments can be very large and each experiment can take substantial execution time, distributing the load over the cloud speeds up the whole process significantly. In this paper we present the system used for experimental validation and comparison of stochastic combinatorial optimization algorithms, applied in the specific case of project scheduling problems.
10:30 AM - 10:45 AM F. Serbet, T. Kaya, M. Ozdemir (Firat University, Elazig, Turkey)
Design of Digital IIR Filters Using Particle Swarm Optimization 
The paper aims to establish a solution methodology for the optimal design of digital Infinite Impulse Response (IIR) filters by integrating the features of Particle Swarm Optimization (PSO). PSO is a method that optimizes a problem by iteratively trying to improve a candidate solution with regard to a given measure of quality, applying mathematical formulas to each particle's position and velocity. The applied method explores the search space both locally and globally and uses the findings in the update formulas. The optimal design of the IIR filter is realized with the result obtained with PSO.
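The position and velocity updates the abstract alludes to follow the standard PSO formulation; a minimal sketch is below. The inertia and acceleration coefficients (`w`, `c1`, `c2`), the search bounds, and the sphere test objective are generic illustrative defaults, not the filter-design objective or the parameter values used in the paper:

```python
import random

def pso_minimize(f, dim=2, n_particles=20, iters=100,
                 w=0.7, c1=1.5, c2=1.5, seed=0):
    # canonical PSO: each particle keeps its personal best, the swarm
    # keeps a global best; velocities blend inertia with pulls toward both
    rng = random.Random(seed)
    xs = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vs = [[0.0] * dim for _ in range(n_particles)]
    pbest = [x[:] for x in xs]
    pbest_f = [f(x) for x in xs]
    g = min(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vs[i][d] = (w * vs[i][d]
                            + c1 * r1 * (pbest[i][d] - xs[i][d])
                            + c2 * r2 * (gbest[d] - xs[i][d]))
                xs[i][d] += vs[i][d]
            fx = f(xs[i])
            if fx < pbest_f[i]:
                pbest[i], pbest_f[i] = xs[i][:], fx
                if fx < gbest_f:
                    gbest, gbest_f = xs[i][:], fx
    return gbest, gbest_f
```

For IIR filter design, `f` would be replaced by an error measure between the designed and the desired frequency response, with the particle coordinates encoding the filter coefficients.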
10:45 AM - 11:00 AM S. Stoyanov, N. Kakanakov (Technical University of Sofia, Plovdiv branch, Plovdiv, Bulgaria)
Big Data Analytics in Electricity Distribution Systems 
Many problems in power distribution systems affecting today's technological equipment are often generated locally within a facility from any number of situations, such as local construction, heavy startup loads, faulty distribution components, and even typical background electrical noise. Penetration of advanced sensor systems such as advanced metering infrastructure (AMI) and high-frequency overhead and underground current and voltage sensors has been increasing significantly in power distribution systems over the past few years. To manage the massive amounts of data generated from smart meters and other components of the grid, utility companies need a solution such as the Apache Hadoop ecosystem, which operates in a distributed manner rather than using the centralized computing model. The paper aims to discuss a solution for easy discovery of power quality problems of local origin, which collects data from AMI and implements distributed computing across clusters of computers using simple programming models.
11:00 AM - 11:15 AM D. Tomic, Z. Car, D. Ogrizovic (Rijeka University, Rijeka, Croatia)
Running HPC Applications on Many Million Cores Cloud 
Despite various hardware and software improvements in Cloud architecture, a huge performance gap still exists between commodity supercomputers and the Cloud when running communication-intensive HPC applications. In order to find what is preventing them from scaling better on the Cloud, we evaluated HPL and NAMD benchmarks on an HPE OpenStack testbed, and NAMD benchmarks on the supercomputer located at the Rijeka University Supercomputing Center. Our results revealed two major bottlenecks: the throughput of the Cloud interconnect, and the Cloud orchestration layer, responsible among other things for the management of communication between Cloud instances. We also investigated the influence of jitter, but did not find a significant influence on performance. Our conclusion is that by solely increasing the interconnect throughput, one will not be able to improve the scalability of communication-intensive HPC applications in the Cloud. This is also backed up by research from other authors: NAMD benchmarks on the Open Cirrus, Eucalyptus and Taub Clouds performed at HP Labs, and an HPL benchmark performed at the San Diego Supercomputing Center. We propose two possible scenarios for scalability improvements. One would be to develop a distributed model of the Cloud orchestration layer; another to use bare-metal containers, thus avoiding the Cloud orchestration layer altogether. Lastly, efficient load balancing of HPC load, especially in hybrid Clouds, remains a must if we want to see HPC applications scaling over many millions of Cloud cores. For this, we propose a novel SLEM-based load balancing strategy, already proven to be highly efficient on heterogeneous clusters.
11:15 AM - 11:30 AM I. Astrova (Tallinn University of Technology, Tallinn, Estonia), A. Koschel, C. Eickemeyer, J. Kersten, N. Offel (University of Applied Sciences and Arts, Hannover, Germany)
DBaaS Comparison 
Cloud-based computing holds promise for businesses that no longer need to own and maintain costly fixed hardware assets. Seeing the benefits of lower costs and scale-as-needed capability, businesses are pursuing the cloud in ever increasing numbers. Database-as-a-Service (DBaaS) offerings are increasingly attractive to IT professionals. Many DBaaS offerings are relational database management systems (RDBMS) requiring the use of SQL, but some are based on alternative technologies like NoSQL databases. In this paper, we look at both. In particular, we select DBaaS offerings from Amazon and Microsoft (viz., Amazon RDS, Amazon DynamoDB, Azure SQL Database and Azure DocumentDB) and parse out similarities and differences in their features, including user interface, backup and restore capability, security, maintenance, database language richness, scaling and availability. We also discuss pricing.
11:30 AM - 11:45 AM Coffee break
11:45 AM - 12:00 PM K. Besedin, P. Kostenetskiy (South Ural State University, Chelyabinsk, Russian Federation)
Modeling Heterogeneous Computational Cluster Hardware in Context of Parallel Database Processing 
Mathematical modeling is an important approach for creating a parallel database management system that could efficiently use the capabilities provided by modern heterogeneous computational clusters equipped with manycore coprocessors or GPUs. To this day, several models of heterogeneous computational systems have been proposed, but none of them is suited for modeling database processing. In this paper, we address this problem by proposing the Heterogeneous Database Multiprocessor Model. The proposed model consists of several submodels which describe different aspects of database processing on heterogeneous computational systems. This paper describes the hardware platform submodel, which describes the hardware of the modeled computational cluster, and the execution submodel, which describes the rules of cooperation between hardware submodel components.
12:00 PM - 12:15 PM V. Golodov (South Ural State University (national research university), Chelyabinsk, Russian Federation)
Properties of Mathematical Number Model Provided Exact Computing 
The paper describes the author's experience in identifying computer arithmetic implementation principles that allow extending the mathematical properties of a number to its computer representation. The range of representable numbers and bitwise identity of the result on different computer architectures and in parallel computations are very important in the current cloud and parallel computing era. Implementation issues are shown on the example of a rational number representation, but the implementation procedure may be used for any other number representation.
12:15 PM - 12:30 PM P. Kostenetskiy (South-Ural State University, Chelyabinsk, Russian Federation)
Simulation of the Parallel Database Column Coprocessor 
The paper proposes a mathematical model that allows exploring the effectiveness of different hardware configurations of cluster computing systems based on multi-core coprocessors while processing databases using the approach of distributed columnar indices.
12:30 PM - 12:45 PM B. von St. Vieth, J. Rybicki (Forschungszentrum Juelich GmbH, Juelich, Germany), M. Brzezniak (Poznan Supercomputing and Networking Center (PSNC), Poznan, Poland)
Towards Flexible Open Data Management Solutions 
Many data sharing initiatives have emerged in recent times, with various driving factors behind this phenomenon subsumed under the terms Open Data and Open Research. The most prosaic one is the fact that more and more funding agencies require sharing of project results. Sharing enables the verification of obtained results by facilitating repeatability and reproducibility. Economic aspects are also important: data reuse can reduce research costs, and redoing the same experimental work can be avoided (provided sufficient visibility of earlier results). Open Data also contributes to a researcher's visibility and reputation. The implementation of data sharing is, however, quite challenging. It demands both storage technologies and services that can deal with the increasing amounts of data in an efficient and cost-effective way while remaining user-friendly and future-proof. We describe the architecture of a data management solution that is able to provide a scalable, extensible, yet cost-effective storage engine with several replication mechanisms. For the sake of user-friendliness and high data visibility, the solution has a mechanism to create flexible namespaces for efficient organization, searching, and retrieval of the data. Such namespaces can be individually tailored to the researchers' needs, potentially facilitating data reuse across community borders. The presented architecture embodies our experiences gathered in the EU-funded project EUDAT2020.
12:45 PM - 1:00 PM M. Kranjac (University of Novi Sad, Faculty of Technical Sciences, Novi Sad, Serbia), U. Sikimić (Singapore Management University, Institute of Innovation & Entrepreneurship, Singapore, Singapore), J. Salom (University of Novi Sad, Faculty of Technical Sciences, Novi Sad, Serbia), S. Tomić (University Nikola Tesla, Faculty for Engineering Management, Beograd, Serbia)
Spatial Analysis of Clustering Process  
The creation of industry clusters is a requirement of the modern economy to shape itself under permanently changing external and internal conditions. Transition countries like Serbia are in a position to act in foreign markets with a united offer of products to reach critical mass and enter new markets. Governmental support for the clustering process is inevitable, and it starts to be important when the mapping of the market begins and the clustering territories start to appear. The Government of Vojvodina recognized the importance of clusters and started to give financial aid and political support to the first clusters in Vojvodina in 2007. After 9 years of activities there are still problems and a need for changes and improvements. There is a need to analyze the clustering process with spatial components. The authors of this paper present a visualization of cluster activities, the results reached, and the reflection of governmental support on clustering success. It is performed using the visualization tool QGIS. They have found very big differences between clusters and big changes during clusters' life cycles, and found a dependence of these factors on sectors of work. The authors discuss the clustering process, recognize various stages of cluster development, and suggest some visualization tools which should be used to follow the clustering process.
Thursday, 5/25/2017 3:00 PM,
Camelia 2, Grand hotel Adriatic, Opatija
3:00 PM - 4:00 PM Distributed Computing and Visualization
3:00 PM - 3:15 PM I. Prazina, V. Okanovic, K. Balic, S. Rizvic (Faculty of Electrical Engineering, Sarajevo, Bosnia and Herzegovina)
Usage of Android Device in Interaction with 3D Virtual Objects 
The implementation of natural interactive online 3D visualization is a difficult process. To keep people interested in virtual heritage, new ways of immersive interaction must be developed. This paper presents a solution for natural interaction with 3D objects in virtual environments using an Android device over WiFi. It describes the principles and concepts of the functioning of individual components, as well as the principles of operation after the integration of all components into a single software solution.
3:15 PM - 3:30 PM L. Kunić, Ž. Mihajlović (University of Zagreb Faculty of Electrical Engineering and Computing, Zagreb, Croatia)
Generating 3D Guitar Strings Using Scripts 
Artists who create 3D models usually rely on the traditional method of direct mesh manipulation using basic operations such as translation, rotation, scaling, and extrusion. In some cases, creating a model in this manner requires performing many repetitive and precise actions, which makes a fully manual approach suboptimal. This paper explores an alternative concept of 3D modeling using scripts, which aims to automate parts of the modeling process. Existing procedural algorithms use scripts to generate objects based on mathematical models (e.g. generating realistic terrains using fractals), or to build complex structures using simple template models (Model Synthesis). Unlike those methods, which rely on stochastic behavior to generate pseudo-randomized shapes, the goal of this method is to write scripts that generate parametric objects which would be either difficult or time-consuming to model by hand. The example described in this paper is a script that generates animated strings for musical instruments. Although the basic shape of a string is quite simple, animating string vibrations and bending can be quite a tedious task, especially because of the various shapes and sizes of string instruments.
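As a minimal illustration of the script-based parametric modeling idea, the sketch below samples a vibrating string as a standing-wave polyline, the kind of geometry a modeling script could emit frame by frame. The function name, its parameters, and the single-mode vibration model are hypothetical illustrations, not code from the paper:

```python
import math

def string_points(length, segments, amplitude, mode, phase):
    # sample a string displaced by one standing-wave mode:
    # y(t) = A * sin(mode * pi * t) * cos(phase), t in [0, 1];
    # sweeping `phase` over time animates the vibration
    pts = []
    for i in range(segments + 1):
        t = i / segments
        y = amplitude * math.sin(mode * math.pi * t) * math.cos(phase)
        pts.append((t * length, y, 0.0))
    return pts
```

A script would convert such point lists into mesh vertices and keyframes, which is exactly the repetitive, precision-sensitive work that is tedious to do by hand.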
3:30 PM - 3:45 PM A. Sabou, D. Gorgan (Technical University of Cluj-Napoca, Cluj-Napoca, Romania)
Remote Interactive Visualization for Particle-based Simulations on Graphics Clusters 
Particle-based models are widely spread in the field of Computer Graphics, and mainly used for real-time simulations of soft deformable bodies. However, simulations including high-resolution models have a great computational cost and, when adding the need for real-time rendering and interaction, they fall far outside the range of applications that traditional computing architectures can accommodate. Graphics clusters can offer the raw computing power needed for such simulations but, due to their physical design and operating mode, introduce a series of challenges that must be overcome, such as efficient distributed rendering and remote visualization of and interaction with the simulated scenes. This paper presents a solution for interactive visual particle-based simulations on graphics clusters using an optimized in-situ distributed rendering approach which, coupled with state-of-the-art remote visualization and interaction techniques and tools, provides efficient means for highly scalable interactive simulations.
3:45 PM - 4:00 PM P. Lavrič, C. Bohak, M. Marolt (University of Ljubljana, Ljubljana, Slovenia)
Collaborative View-Aligned Annotations in Web-Based 3D Medical Data Visualization 
The paper presents our web-based 3D medical data visualization framework with emphasis on user collaboration. The framework supports visualization of volumetric data and 3D meshes in web browsers. The paper focuses on the integration of user-shareable 3D view-aligned hand-drawn or written annotations into the visualization framework. Annotations are created on separate transparent canvases which are aligned with selected views. View parameters are part of the annotations and can be shared with other users over the network. Our implementation allows for real-time sharing of annotations during creation. Annotations from the same or different users can be overlaid within the same view. Annotations were implemented through adaptation of the framework's rendering pipeline, which allows for combining multiple visualization layers into a unified final render. View-aligned annotations were added in addition to text annotations pinned to 3D locations on the displayed model. In the framework, users can browse through all annotations, whereby upon selection of a 3D view-aligned annotation the camera is positioned according to the stored parameters and the annotation is displayed.
4:00 PM - 8:15 PM Biomedical Engineering
4:00 PM - 4:15 PM S. Sprager (University of Ljubljana, Ljubljana, Slovenia), R. Trobec (Jozef Stefan Institute, Ljubljana, Slovenia), M. Juric (University of Ljubljana, Ljubljana, Slovenia)
Feasibility of Biometric Authentication Using Wearable ECG Body Sensor Based on Higher-Order Statistics 
Besides its principal purpose in the field of biomedical applications, ECG can also serve as a biometric trait due to its unique identity properties, including user-specific deviations in ECG morphology and heart rate variability. In this paper, we explore the possibility of using long-term ECG data acquired by an unobtrusive chest-worn ECG body sensor during daily living for accurate user authentication and identification. To this end, we propose a novel framework for wearable ECG-based user recognition. The core of the framework is based on an approach that employs higher-order statistics on cyclostationary data, already efficiently applied to inertial-sensor-based gait recognition. Experimental data was collected from four subjects during their regular daily activities, with more than 6 hours of ECG data per subject, and then applied to the proposed framework. Preliminary results (equal error rate from 6% to 13%, depending on the experimental parameters) indicate that such authentication is feasible and reveal clear guidelines for future work.
4:15 PM - 4:30 PM M. Mohorčič, M. Depolli (Jozef Stefan Institute, Ljubljana, Slovenia)
Bio-Medical Analysis Framework 
Monitoring of the ECG signal is used in medicine for multiple purposes. Measurements can be taken at any stage of a patient's medical care, whether as preventative, diagnostic or recovery monitoring. Current wearable technology enables users and doctors to produce a so far unprecedented amount of information. Processing of such measurements is usually a laborious and time-consuming manual task. Automatic processing of such measurements is neither well defined nor thoroughly tested. In this work we focused on the needs of both health care professionals and IT engineers developing software for processing long-term multi-sensor measurements. Taking into account the future expandability of multi-sensor gadgets, we propose a new framework which is able to show the data measured by a wearable ECG monitor, process it, and compare algorithms for automatic processing. We determined possible signal sources along with their values, units and time continuity. We propose suitable file formats for the storage of such measurements, keeping in mind future expandability, size demands and usage of the formats. The suggested framework can therefore be used to display, automatically process and store discrete and continuous biomedical signals besides ECG, adding value to the gathered measurements.
4:30 PM - 4:45 PM M. Machado, J. Pereira, M. Silva, R. Fonseca-Pinto (Instituto de Telecomunicações, Leiria, Portugal)
Finding a Signature in Dermoscopy: a Color Normalization Proposal 
Digital image methodologies related to melanoma have become in the past years a major support for differential diagnosis in skin cancer. Computer Aided Diagnosis (CAD) systems, encompassing image acquisition, artifact removal, and detection and selection of features, highlight Machine Learning algorithms as a novel strategy towards digitally assisted diagnosis in Dermatology. Despite the central role played by color in dermoscopic image assessment, Machine Learning algorithms mainly use texture and shape features derived from gray-level images obtained from the true-color images of the skin. Since the acquisition conditions are key for color characterization and thus central for the quantification of different colors in a dermoscopic image, this work presents a strategy for color normalization, together with its use in the calculation of the number of colors in a dermoscopic image. This methodology will contribute to uniformity in the use of color features extracted from different datasets in CAD systems (acquired by distinct dermoscopes) possibly presenting distinct illumination characteristics. This normalization proposal can also be applied as an image preprocessing step, aimed at achieving higher scores in the standard metrics of ML algorithms.
4:45 PM - 5:00 PM R. Fonseca-Pinto, M. Machado (Instituto de Telecomunicações, Leiria, Portugal)
A Textured Scale-Based Approach to Melanocytic Skin Lesions in Dermoscopy 
Melanoma is the most dangerous and lethal form of human skin cancer, and early detection is fundamental for its successful management. In recent years the use of automatic classification algorithms in the context of Computer Aided Diagnosis (CAD) systems has become an important tool, improving quantification metrics and also assisting in decisions regarding lesion management. This paper presents a novel and robust texture-based approach to detect melanomas among melanocytic images obtained by dermoscopy, using Local Binary Pattern Variance (LBPV) histograms after the Bidimensional Empirical Mode Decomposition (BEMD) scale-based decomposition methodology. The results show that it is possible to develop a robust CAD system for the classification of dermoscopy images obtained from different databases and acquired in diverse conditions. After the initial texture-scale based classification, a postprocessing refinement is proposed using reticular pattern and color, achieving 97.83, 94.44 and 96.00 for Sensitivity, Specificity and Accuracy.
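The texture features mentioned build on Local Binary Patterns. A minimal sketch of the basic 8-neighbour LBP code is given below; the variance weighting of LBPV and the BEMD decomposition stages used in the paper are omitted, and the nested-list image representation is an illustrative simplification:

```python
def lbp_codes(img):
    # compute the 8-neighbour Local Binary Pattern code for each interior
    # pixel: a neighbour at least as bright as the centre contributes a bit
    h, w = len(img), len(img[0])
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    codes = []
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            c = img[i][j]
            code = 0
            for bit, (di, dj) in enumerate(offsets):
                if img[i + di][j + dj] >= c:
                    code |= 1 << bit
            codes.append(code)
    return codes
```

A histogram of these codes over a lesion region is what feeds the classifier; LBPV additionally weights each code's histogram bin by the local variance.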
5:00 PM - 5:15 PM J. Opiła, T. Pełech-Pilichowski (AGH University of Science and Technology, Cracow, Poland)
Remarks on Visualization of Fuzziness of Cardiac Data 
In contemporary health science, sophisticated apparatus delivers a lot of data on vital processes in patients. All of it is processed as a bulk of numbers not directly suitable for diagnostic or research purposes. Moreover, as is common in biomedical sciences, measured data are intrinsically inaccurate, i.e., fuzzy. In order to overcome these deficiencies, a set of visualization methods has been developed, as well as dedicated file formats. In the paper the authors discuss selected formats and imaging techniques useful for cardiologists. Problems of medical data processing are outlined. Strengths and weaknesses of the raw STL file format are analyzed. Visualization styles for data fuzziness using the experimental package ScPovPlot3D, based on POV-Ray, are proposed and discussed.
5:15 PM - 5:30 PM A. Rashkovska, V. Avbelj (Jožef Stefan Institute, Ljubljana, Slovenia)
Abdominal Fetal ECG Measured with Differential ECG Sensor 
Abdominal ECG (AECG) is a non-invasive method for monitoring the cardiac activity of a fetus. A complementary method is the detection of the fetal heart rate with ultrasound. In this paper, we present and analyze AECG measurements obtained with a differential ECG body sensor. AECG was measured in different months of pregnancy in two subjects: one carrying a single fetus and another carrying twins. The fetal ECG signal measured on the abdomen during pregnancy is superimposed on the mother's AECG and has a very small amplitude, smaller than the amplitude of the mother's ECG in that part of her body. The interference from the power grid is not present in the signal, which is crucial for further analysis. The recordings demonstrate the remarkable potential of the sensor for AECG measurements.
5:30 PM - 5:45 PM A. Vilhar, M. Depolli (Jožef Stefan Institute, Ljubljana, Slovenia)
Synchronization of Time in Wireless ECG Measurement 
Wireless devices for ambulatory ECG monitoring are becoming increasingly popular. One of the challenges that wireless monitoring has to deal with is clock synchronization between the wireless sensor device and the controller device. The monitoring device is usually kept as simple as possible to maximize its autonomy, and is thus limited in its clock accuracy. In this article, we describe a method of off-line synchronization of data from a completely asynchronous sensor device. The sensor device sampling frequency is allowed to deviate from its declared value by more than 100 ppm, while the controlling device time is assumed to be absolutely accurate. Data synchronization is based on using two sources of timestamps for the data -- the sensor device oscillator and the controller device local time. In our case the controlling device is an Android mobile phone, and the wireless technology used is Bluetooth LE. The challenges are: limited precision of timestamps given by the sensor device, buffering of messages on Android, and lost messages due to the lossy transmission mode. The presented synchronization technique is able to achieve less than 0.1 second per day of time drift in spite of the listed challenges.
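The core of such off-line synchronization can be sketched as a least-squares line fit between sensor sample indices and controller timestamps, whose slope recovers the true sampling period. This is a simplified sketch of the general idea, not the authors' method, and the 128 Hz rate and 150 ppm deviation in the test are assumed numbers for illustration:

```python
def fit_clock(sample_idx, ctrl_times):
    # least-squares fit of the line t = a * n + b, mapping sensor sample
    # index n to controller time t; the slope a estimates the true
    # sampling period, so 1/a is the actual (drifted) sampling frequency
    n = len(sample_idx)
    mx = sum(sample_idx) / n
    my = sum(ctrl_times) / n
    num = sum((x - mx) * (y - my) for x, y in zip(sample_idx, ctrl_times))
    den = sum((x - mx) ** 2 for x in sample_idx)
    a = num / den
    b = my - a * mx
    return a, b
```

In practice the controller timestamps are noisy (buffering, lost messages), which is exactly why a fit over many samples is preferred over trusting any individual timestamp.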
5:45 PM - 6:00 PM M. Jan (University Medical Centre, Ljubljana, Slovenia), R. Trobec (Institut Jožef Stefan, Ljubljana, Slovenia)
Long-term Follow-up Case Study of Atrial Fibrillation after Treatment 
A case study is presented, based on long-term ECG measurements of a patient with diagnosed persistent atrial fibrillation (PeAF) who underwent the classical diagnostic procedures. The long-term measurements were performed with an ECG body sensor. Based on the European Heart Rhythm Association (EHRA) guidelines for the treatment of atrial fibrillation, left atrial catheter cryo-ablation with an endpoint of pulmonary vein isolation was performed. After the cryo-ablation PeAF still persisted, therefore an additional catheter radiofrequency ablation was performed. After the second procedure, and in combination with antiarrhythmic drugs, the atrial fibrillation (AF) was controlled at the level of relatively rare and short documented AF episodes. A detailed analysis of a long-term measurement enabled the detection of a large spectrum of arrhythmias, documented over a ten-week period of measurements. These include atrial extra beats and nonsustained atrial tachycardias that might be the initial triggers for AF. The initial study motivates new hypotheses about the long-term impact of ablation procedures and antiarrhythmic drugs on the outcome of medical therapies, which deserve to be further elucidated in a larger and more systematic study.
6:00 PM - 6:15 PM Coffee break
6:15 PM - 6:30 PM M. Brložnik (Clinic for small animals and surgery, Veterinary faculty, University of Ljubljana, Ljubljana, Slovenia), V. Avbelj (Department of Communication Systems, Jožef Stefan Institute, Ljubljana, Slovenia)
A Case Report of Long-Term Wireless Electrocardiographic Monitoring in a Dog with Dilated Cardiomyopathy 
A wireless electrocardiographic (ECG) sensor attached to the skin and connected to a smart device via low-power Bluetooth technology has been used to record more than 500 hours of ECG data in a German shepherd dog with dilated cardiomyopathy (DCM). Wireless ECG monitoring was used for a period of 6 months. With the wireless body electrodes, ECG data were obtained while the dog was resting, walking, playing and eating. Atrial fibrillation, ventricular premature complexes, occasional ventricular tachycardia and multiform ventricular beats were observed. Numerous standard 6-lead ECG recordings were compared to the recordings obtained with the wireless body electrodes. Instantaneous and average heart rates and standard duration measurements evaluated with the two devices were identical in all cases. The extended ECG monitoring time with the wireless device increased the diagnostic yield of arrhythmias. The dog was treated with diuretics, positive inotropes, an ACE inhibitor and antiarrhythmics for 2 years. The influence of various drugs, the dog's activities, and environmental factors on the ECG data was investigated. During the 6-month period the dog's condition changed substantially, and long-term ECG monitoring excluded arrhythmias as the cause of the dog's weakness. The wireless device, which proved to be reliable and simple to use, enables an excellent option for long-term monitoring of canine cardiac rhythm in a real-world environment.
6:30 PM - 6:45 PMA. Ristovski (Innovation dooel, Skopje, Macedonia), M. Gusev (Ss. Cyril and Methodius University, Skopje, Macedonia)
SaaS Solution for ECG Monitoring Expert System 
The advance in embedded systems and sensor technology has only recently unlocked the potential of wearable ECG monitoring expert systems. So far, the ECG monitoring software solutions that deal with the processing of harnessed ECG data have consisted of straightforward desktop applications. Although a number of telecardiology concepts have been proposed, none of them is capable of completely replacing the extensiveness of Holter monitoring services in the form of a SaaS application. The SaaS infrastructure presented is a fully functional commercial service intended to replace the standard desktop setup in a scenario where the ECG sensor is capable of transmitting the data to the cloud.
6:45 PM - 7:00 PMD. Kaya, M. Turk, T. Kaya (Firat University, Elazig, Turkey)
Wavelet-Based Analysis Method for Heart Rate Detection of ECG Signal Using LabVIEW 
LabVIEW is a graphical programming language that uses a dataflow model instead of sequential lines of text code. LabVIEW allows multiple operations to work in parallel, so designers spend less development time than with a text-based programming language. Application areas such as signal processing, image processing and data analysis are supported. In this paper, wavelet analysis is used to eliminate undesired frequency noise. From the resulting noise-free signal, the accurate heart rate was determined with the help of a program developed in the LabVIEW environment. The performance of the system was tested with different wavelet types and satisfactory results were obtained.
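The pipeline the abstract describes — wavelet denoising followed by peak detection and heart rate computation — can be sketched in a few lines outside LabVIEW. The fragment below is a minimal Python illustration, not the authors' code: the Haar wavelet, the hard-thresholding rule, and the simple local-maximum peak detector are all assumptions standing in for the paper's unspecified choices.

```python
import numpy as np

def haar_dwt(x):
    """One level of the Haar discrete wavelet transform."""
    a = (x[0::2] + x[1::2]) / np.sqrt(2)  # approximation coefficients
    d = (x[0::2] - x[1::2]) / np.sqrt(2)  # detail coefficients
    return a, d

def haar_idwt(a, d):
    """Inverse of one Haar DWT level."""
    x = np.empty(2 * len(a))
    x[0::2] = (a + d) / np.sqrt(2)
    x[1::2] = (a - d) / np.sqrt(2)
    return x

def denoise(x, threshold):
    """Hard-threshold the detail coefficients, then reconstruct."""
    a, d = haar_dwt(x)
    d = np.where(np.abs(d) > threshold, d, 0.0)
    return haar_idwt(a, d)

def heart_rate_bpm(signal, fs, peak_height):
    """Average heart rate from local maxima above peak_height (R-peak proxies)."""
    peaks = [i for i in range(1, len(signal) - 1)
             if signal[i] > peak_height
             and signal[i] >= signal[i - 1] and signal[i] > signal[i + 1]]
    rr = np.diff(peaks) / fs            # RR intervals in seconds
    return 60.0 / rr.mean()
```

With a threshold of zero the denoising step reconstructs the input exactly, which makes the round trip easy to sanity-check before tuning the threshold on real ECG data.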
7:00 PM - 7:15 PME. Domazet, M. Gusev (University Sts Cyril and Methodius, Skopje, Macedonia)
Parallelization of Digital Wavelet Transformation of ECG Signals 
The advances in electronics and the ICT industry for biomedical use have opened up many new possibilities. However, these IoT solutions face the big data challenge, where data arrives at high velocity and in huge quantities. In this paper, we analyze a situation where wearable ECG sensors stream continuous data to the servers. A server needs to receive these streams from a lot of sensors and to start various digital signal processing techniques, which initiates huge processing demands. We present a new parallel OpenMP solution for the processing of ECG signals and present experimental results for various applications of digital wavelet transformations. The analysis shows a speedup that can cope with the increasing processing demands of these cloud servers.
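The paper's implementation uses OpenMP in a compiled language; as a language-neutral sketch of the same idea — one wavelet-transform task per incoming ECG stream, dispatched to a worker pool — the following Python fragment uses a thread pool. The Haar transform and the one-task-per-stream decomposition are illustrative assumptions, not the authors' design.

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def haar_dwt(x):
    """One Haar DWT level: pairwise sums (approximation) and differences (detail)."""
    a = (x[0::2] + x[1::2]) / np.sqrt(2)
    d = (x[0::2] - x[1::2]) / np.sqrt(2)
    return a, d

def transform_streams(streams, workers=4):
    """Transform many ECG streams concurrently, one pool task per stream."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(haar_dwt, streams))
```

In CPython, true parallel speedup with threads relies on numpy releasing the GIL inside its kernels; an OpenMP parallel-for over streams, as in the paper, avoids that caveat entirely.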
7:15 PM - 7:30 PMI. Čuljak, M. Cifrek (University of Zagreb, Faculty of Electrical Engineering and Computing, Zagreb, Croatia)
Hilbert Transform Based Paroxysmal Tachycardia Detection Algorithm 
Paroxysmal tachycardia (supraventricular and ventricular) is an episodic condition with an abrupt onset and termination followed by a rapid heart rate, usually between 140 and 250 beats per minute. Paroxysmal tachycardia can be discovered by detecting the QRS complexes in ECG signals. Supraventricular tachycardia is characterized by a narrow QRS complex; ventricular tachycardia, on the other hand, is characterized by a broad QRS complex. Detecting paroxysmal tachycardia can help prevent the pathogenesis of heart disease. An implementation of an algorithm for QRS detection based on properties of the Hilbert transform is proposed in this paper. The results of the algorithm were compared with the Pan-Tompkins algorithm, and the detection efficiency of the implemented algorithm on the used signals was 97.5 %. Both algorithms were tested using recordings from the MIT-BIH Arrhythmia database.
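The key property such detectors exploit is that the envelope of the analytic signal (the magnitude of the signal plus i times its Hilbert transform) peaks sharply at QRS complexes in the differentiated ECG. A minimal Python sketch of this idea — not the paper's algorithm; the frequency-domain analytic-signal construction and the fixed threshold are assumptions — looks like this:

```python
import numpy as np

def analytic_signal(x):
    """Analytic signal via the standard frequency-domain Hilbert construction:
    zero the negative frequencies, double the positive ones."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1
    if n % 2 == 0:
        h[n // 2] = 1
        h[1:n // 2] = 2
    else:
        h[1:(n + 1) // 2] = 2
    return np.fft.ifft(X * h)

def qrs_candidates(ecg, threshold):
    """Indices where the envelope of the differentiated ECG exceeds a threshold."""
    envelope = np.abs(analytic_signal(np.diff(ecg)))
    return np.flatnonzero(envelope > threshold)
```

For a pure cosine the envelope is constant at the amplitude, which is a convenient check that the analytic-signal construction is correct before pointing it at noisy ECG.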
7:30 PM - 7:45 PMA. Jovic, D. Kukolja, K. Friganovic (Faculty of Electrical Engineering and Computing, Zagreb, Croatia), K. Jozic (INA-Industrija nafte, d.d., Zagreb, Croatia), S. Car (Dom zdravlja Dubrovnik, Babino polje, Croatia)
Biomedical Time Series Preprocessing and Expert-System Based Feature Extraction in MULTISAB Platform 
In this paper, we review the current state of implementation of the MULTISAB platform, a web platform whose main goal is to provide a user with detailed analysis capabilities for heterogeneous biomedical time series. These time series are often encumbered by noise that prohibits accurate calculation of clinically significant features. The goal of preprocessing is either to completely remove the noise or at least to ameliorate the quality of the recorded series. The focus of this paper is on the description of an expert feature recommendation system for electrocardiogram analysis. We demonstrate the process through which one arrives at a point where significant expert features are proposed to a platform user, based on time series at hand, analysis goal, and available length of the time series. We also provide the description of implemented preprocessing techniques and feature extraction procedure within the platform.
7:45 PM - 8:00 PMM. Makovec (Novo mesto General Hospital, Novo mesto, Slovenia), U. Aljančič, D. Vrtačnik (University of Ljubljana, Faculty of Electrical Engineering, Ljubljana, Slovenia)
Evaluation of Chronic Venous Insufficiency with PPG Prototype Instrument 
Photoplethysmography (PPG) is a non-invasive optical technique for measuring blood volume changes inside an organ or body part. In the presented study, chronic venous insufficiency (CVI) of 25 individuals (38 legs) was measured using PPG prototype instruments. The results of the PPG evaluation were compared to the results of Doppler ultrasound, which is the gold standard for the diagnosis of CVI. The calculated sensitivity and specificity of the PPG prototype instrument were 82% and 81%, respectively.
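Sensitivity and specificity against a gold standard reduce to counting the four outcomes of a per-leg comparison: sensitivity = TP/(TP+FN), specificity = TN/(TN+FP). A small self-contained Python helper (illustrative only; the study's actual tabulation is not given in the abstract):

```python
def sensitivity_specificity(predicted, gold):
    """Compare a test's binary results against a gold standard
    (True = condition present, e.g. CVI on Doppler ultrasound)."""
    tp = sum(p and g for p, g in zip(predicted, gold))          # true positives
    tn = sum((not p) and (not g) for p, g in zip(predicted, gold))  # true negatives
    fn = sum((not p) and g for p, g in zip(predicted, gold))    # missed cases
    fp = sum(p and (not g) for p, g in zip(predicted, gold))    # false alarms
    return tp / (tp + fn), tn / (tn + fp)
```

Applied to the study's 38 legs, sensitivity of 82% and specificity of 81% mean roughly four in five diseased legs and four in five healthy legs were classified correctly by the PPG prototype.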
8:00 PM - 8:15 PMZ. Juhasz (University of Pannonia, Veszprem, Hungary)
Highly Parallel Online Biomedical Signal Processing on GPU Architecture 
Signal processing is of central importance in biomedical systems, in which pre-processing steps are unavoidable in order to reduce noise, remove unwanted artefacts, segment time series into smaller epochs, or extract statistical and other descriptive features that can be used in consecutive classification stages. The high sampling frequencies and electrode counts used in advanced EEG or body-surface potential mapping ECG systems result in very large data sets, which require considerable time to process. While long computation times can be tolerated in research settings, in many application areas, e.g. brain computer interfaces, online feature detection (e.g. in epilepsy or heart monitoring) or clinical diagnostics, fast – often real-time – execution speed is required. Traditional multi-core CPUs provide only a partial solution to this problem, as the small number of cores severely limits the achievable speedup and scalability of the algorithms. GPUs, on the other hand, have tremendous ‘horse power’ with several teraflops of raw computational power and thousands of cores per card, even in embedded versions. It seems natural to use these devices for the demanding EEG and ECG signal processing tasks. This paper illustrates that GPU implementation strategies of parallel signal processing algorithms and their performance can vary widely, and the highest performance is achievable only if great attention is given to the hardware execution characteristics of the GPU. Overlapped data transfer and computation, optimized use of the GPU memory hierarchy, using enough threads to hide memory transactions, and executing multiple independent instructions per thread to take advantage of the internal instruction pipeline are some of the techniques that affect the final performance of our algorithms.
A set of representative algorithms (time and frequency domain FIR filters, correlation, power spectrum, wavelet transformation, statistical features) is implemented in parallel and executed in a pipeline fashion, first on a multi-core CPU to provide baseline performance figures. Then the performance of the GPU version of the algorithms is examined in terms of the above optimization techniques and compared with the CPU results. It is shown that with in-depth profiling and performance analysis, using an incremental development strategy, very efficient signal processing algorithm implementations can be created. The contribution of the paper is the analysis of performance-critical factors and a demonstration of their effect on the performance of the algorithms. While the emphasis is on EEG and ECG algorithms, the techniques and results shown can be of benefit for other signal processing application areas as well.
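The abstract's distinction between time-domain and frequency-domain FIR filtering is worth making concrete: both compute the same convolution, and the frequency-domain route (multiply the FFTs, invert) is what typically maps well onto GPU FFT libraries. The CPU baseline pair below is a hedged Python/numpy sketch, not the authors' implementation:

```python
import numpy as np

def fir_time(x, h):
    """Time-domain FIR filtering: direct linear convolution of signal x with taps h."""
    return np.convolve(x, h, mode="full")

def fir_freq(x, h):
    """Frequency-domain FIR filtering: zero-pad both to the full output length,
    multiply the real FFTs, and invert."""
    n = len(x) + len(h) - 1
    return np.fft.irfft(np.fft.rfft(x, n) * np.fft.rfft(h, n), n)
```

For long signals and long filters the frequency-domain form wins asymptotically (O(n log n) vs O(n·m)); the two agree to floating-point precision, which is the natural correctness check before porting either to the GPU.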

Basic information:
Chairs:

Karolj Skala (Croatia), Roman Trobec (Slovenia), Uroš Stanič (Slovenia)

Steering Committee:

Enis Afgan (Croatia), Almir Badnjevic (Bosnia and Herzegovina), Piotr Bala (Poland), Leo Budin (Croatia), Jelena Čubrić (Croatia), Borut Geršak (Slovenia), Simeon Grazio (Croatia), Gordan Gulan (Croatia), Ladislav Hluchy (Slovakia), Željko Jeričević (Croatia), Peter Kacsuk (Hungary), Aneta Karaivanova (Bulgaria), Zalika Klemenc-Ketiš (Slovenia), Miklos Kozlovszky (Hungary), Charles Loomis (France), Ludek Matyska (Czech Republic), Željka Mihajlović (Croatia), Damijan Miklavčič (Slovenia), Tonka Poplas Susič (Slovenia), Laszlo Szirmay-Kalos (Hungary), Tibor Vámos (Hungary), Matjaž Veselko (Slovenia), Yingwei Wang (Canada)

Registration / Fees:
REGISTRATION / FEES (price in EUR)

                                             Before 8 May 2017   After 8 May 2017
Members of MIPRO and IEEE                           180                200
Students (undergraduate and graduate),
primary and secondary school teachers               100                110
Others                                              200                220


The discount doesn't apply to PhD students.


Contact:

Jelena Cubric
Rudjer Boskovic Institute
Bijenicka 54
HR-10000 Zagreb, Croatia

E-mail: dc-vis@mipro.hr

DC VIS PPT presentation repository

PPT presentations can be uploaded at:  http://dc-vis.irb.hr 

The best papers will get a special award.
Accepted papers will be published in the ISBN registered conference proceedings. Papers presented at the Conference will be submitted for posting to IEEE Xplore.
Authors of outstanding papers will be invited to submit the extended version of their papers to a special issue of Scalable Computing: Practice and Experience (ISSN 1895-1767) published in the first quarter of 2018.

 

International Program Committee General Chair:

Petar Biljanović (Croatia)

International Program Committee:

Slavko Amon (Slovenia), Vesna Anđelić (Croatia), Michael E. Auer (Austria), Dubravko Babić (Croatia), Snježana Babić (Croatia), Almir Badnjevic (Bosnia and Herzegovina), Mirta Baranović (Croatia), Bartosz Bebel (Poland), Ladjel Bellatreche (France), Eugen Brenner (Austria), Gianpiero Brunetti (Italy), Andrea Budin (Croatia), Željko Butković (Croatia), Željka Car (Croatia), Matjaž Colnarič (Slovenia), Alfredo Cuzzocrea (Italy), Marina Čičin-Šain (Croatia), Marko Čupić (Croatia), Marko Delimar (Croatia), Todd Eavis (Canada), Maurizio Ferrari (Italy), Bekim Fetaji (Macedonia), Renato Filjar (Croatia), Tihana Galinac Grbac (Croatia), Paolo Garza (Italy), Liljana Gavrilovska (Macedonia), Matteo Golfarelli (Italy), Stjepan Golubić (Croatia), Francesco Gregoretti (Italy), Stjepan Groš (Croatia), Niko Guid (Slovenia), Jaak Henno (Estonia), Ladislav Hluchy (Slovakia), Vlasta Hudek (Croatia), Željko Hutinski (Croatia), Mile Ivanda (Croatia), Hannu Jaakkola (Finland), Leonardo Jelenković (Croatia), Dragan Jevtić (Croatia), Robert Jones (Switzerland), Peter Kacsuk (Hungary), Aneta Karaivanova (Bulgaria), Marko Koričić (Croatia), Tomislav Kosanović (Croatia), Mladen Mauher (Croatia), Igor Mekjavic (Slovenia), Branko Mikac (Croatia), Veljko Milutinović (Serbia), Nikola Mišković (Croatia), Vladimir Mrvoš (Croatia), Jadranko F. 
Novak (Croatia), Jesus Pardillo (Spain), Nikola Pavešić (Slovenia), Vladimir Peršić (Croatia), Slobodan Ribarić (Croatia), Janez Rozman (Slovenia), Karolj Skala (Croatia), Ivanka Sluganović (Croatia), Mario Spremić (Croatia), Vlado Sruk (Croatia), Stefano Stafisso (Italy), Uroš Stanič (Slovenia), Ninoslav Stojadinović (Serbia), Mateo Stupičić (Croatia), Jadranka Šunde (Australia), Aleksandar Szabo (Croatia), Laszlo Szirmay-Kalos (Hungary), Dina Šimunić (Croatia), Zoran Šimunić (Croatia), Dejan Škvorc (Croatia), Antonio Teixeira (Portugal), Edvard Tijan (Croatia), A Min Tjoa (Austria), Roman Trobec (Slovenia), Sergio Uran (Croatia), Tibor Vámos (Hungary), Mladen Varga (Croatia), Marijana Vidas-Bubanja (Serbia), Mihaela Vranić (Croatia), Boris Vrdoljak (Croatia), Damjan Zazula (Slovenia)

Location:

Opatija, with its 170-year tourist tradition, is the leading seaside resort of the Eastern Adriatic and one of the most famous tourist destinations on the Mediterranean. With its aristocratic architecture and style, Opatija has been attracting renowned artists, politicians, kings, scientists and sportsmen, as well as business people, bankers and managers, for more than 170 years.

The tourist offering of Opatija includes a vast number of hotels, excellent restaurants, entertainment venues, art festivals, superb modern and classical music concerts, beaches and swimming pools and is able to provide the perfect response to all demands.

Opatija, the Queen of the Adriatic, is also one of the most prominent congress cities on the Mediterranean, particularly important for the international ICT conventions MIPRO, held in Opatija since 1979 and gathering more than a thousand participants from more than forty countries. These conventions promote Opatija as the most desirable technological, business, educational and scientific center in Southeast Europe and the European Union in general.


For more details please look at www.opatija.hr/ and www.visitopatija.com
