Presented papers will be submitted for inclusion in the IEEE Xplore Digital Library.
Invited Lecture
|
Maria Lindén (Mälardalen University, Sweden)
Embedded Sensor Systems for Health Plus – Enabling Health Trend Monitoring Anytime, Anywhere
Biomedical Engineering
Chairs: Roman Trobec and Uroš Stanič
Papers
N. Petrovic, I. Tomasic, M. Lindén, P. Risman (Mälardalens högskola, Västerås, Sweden) Detection of Human Bodypart Abnormalities by Microwaves – A New Approach
Microwaves offer possibilities for non-invasive detection of abnormalities such as brain haemorrhages and breast tumours. Microwave imaging was introduced in the early 1980s; the object was submerged in a microwave-absorbing liquid – a bolus – to avoid interference as well as to reduce direct coupling between antennas. While X-rays provide contrast through atomic behaviour and thus an optical-like see-through capability, and ultrasound provides "elasticity" reflections back to the transmitter from different tissue interfaces, microwaves behave very differently: direct see-through and reflections do not work. Instead, diffraction phenomena appear at and around objects with different dielectric properties, which are mainly determined by the water and ionic content and by the object size. Since the contrast mechanisms of the three principles are very different and often quite weak, which makes the quality of examinations very operator-dependent, the methods are often complementary when the equipment is not very expensive, as it is with the "almost perfect" magnetic resonance tomography method.
Microwave equipment is not yet significant in the market, for quite many reasons: properly adapted antenna applicators are not at all related to communication antenna technologies, the signal attenuation is typically quite large so that the instrumentation becomes complex, and disturbing surface waves occur at the outer boundary of the head and breast.
A way of eliminating the surface waves is to immerse the body part in a so-called bolus liquid with dielectric properties similar to those of the body tissues, but that causes discomfort and logistics problems, as well as much stronger signal attenuation than with directly contacting antenna applicators. Our group at MDH is therefore focusing on new kinds of antenna applicators and on studies of how the diffraction phenomena can be utilised maximally towards direct detection of the inhomogeneities. Our R&D work falls into three technically independent areas: transmitting applicators in free air that need no direct contact with the body part and emit no surface waves, movable receiving applicators, and signal handling methods for assisting the operator in the search for, and position determination of, any internal abnormalities. Our goal is to provide rather low-cost equipment that is very easy to use and avoids the risks associated with X-rays, so that it can be used as a first examination and be complemented with e.g. X-ray mammography to achieve a very low rate of false positives/negatives.
|
I. Tomasic, N. Petrovic, M. Lindén (Mälardalen University, Västerås, Sweden), A. Rashkovska (Jozef Stefan Institute, Ljubljana, Slovenia) Comparison of Publicly Available Beat Detection Algorithms Performances on the ECGs Obtained by a Patch ECG Device
Eight ECG beat detection algorithms from PhysioNet's WFDB and Cardiovascular Signal toolboxes were tested for their beat detection accuracy on twenty measurements obtained with the Savvy patch ECG device. For each subject, one measurement was obtained while sitting and one while running. Each measurement lasted from thirty seconds to one minute. The measurements obtained while running were more challenging for all the algorithms: most of them detected almost all beats perfectly on the measurements obtained in the sitting position, but on the measurements obtained while running, all algorithms performed with decreased accuracy. Considering the overall percentage of faultily detected beats, the four best algorithms were jqrs, from the Cardiovascular Signal Toolbox, and ecgpuwave, gqrs, and wqrs, from the WFDB Toolbox, with 1.7%, 2.3%, 2.9%, and 3.0% faultily detected beats, respectively.
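As a hedged illustration of such an evaluation, the sketch below runs one of the named detectors (gqrs, via the Python wfdb package) on a single-lead recording and scores the percentage of faultily detected beats against reference annotations. The record name, annotation extension, and the 150 ms matching window are assumptions for illustration, not details taken from the paper.

```python
# A minimal sketch: run gqrs on one record and compute the faulty-beat
# percentage. File names and tolerances are placeholders.
import numpy as np
import wfdb
from wfdb import processing

record = wfdb.rdrecord("subject01_sitting")        # hypothetical record name
sig = record.p_signal[:, 0]                        # single ECG lead
fs = record.fs

detected = processing.gqrs_detect(sig=sig, fs=fs)
reference = wfdb.rdann("subject01_sitting", "atr").sample  # hypothetical annotations

tol = int(0.150 * fs)   # count a detection as correct within 150 ms of a reference beat
matched = np.array([np.min(np.abs(reference - d)) <= tol for d in detected])
false_pos = np.count_nonzero(~matched)
false_neg = sum(np.min(np.abs(detected - r)) > tol for r in reference)
faulty_pct = 100.0 * (false_pos + false_neg) / len(reference)
print(f"faulty detections: {faulty_pct:.1f}%")
```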
|
M. Brložnik, A. Domanjko Petrič (Small Animal Clinic, Veterinary Faculty, University of Ljubljana, Ljubljana, Slovenia), V. Kadunc Kos (Clinic for Reproduction and Large Animals, Veterinary Faculty, University of Ljubljana, Ljubljana, Slovenia), A. Rashkovska, V. Avbelj (Department of Communication Systems, Jožef Stefan Institute, Ljubljana, Slovenia) Wireless Body Sensor for Electrocardiographic Monitoring in Equine Medicine
A wireless body sensor, connected via Bluetooth to a smart device, was used to obtain ECG data in seven horses while they were standing, walking and trotting. Different positions of electrodes were tested. Measurements with the electrode distance of 8 cm, as proposed for the wireless sensor, gave reliable results only in standing horses, because the voltage of the recorded waves was low and easily overwhelmed by motion artifacts. To increase the voltage of the recorded waves, additional extension wires had to be used. To avoid movement artifacts, the most appropriate placement of electrodes was determined: the positive electrode was placed on the left side of the chest near the caudal cardiac border and the negative electrode on the left side of the withers. In these recordings, QRS complexes could be followed during trotting, while P waves were buried in the irregular baseline caused by the motion. Wireless ECG data were compared to simultaneously recorded standard ECG in all seven horses, and equivalent results were obtained for heart rate, cardiac rhythm and duration of different waves.
|
D. Dojchinovski, A. Ilievski, M. Gusev (FCSE, Skopje, Macedonia) Interactive Home Healthcare System with Integrated Voice Assistant
Persistent technological developments in healthcare have saved countless lives and continue to improve quality of life over time. They have also had a huge impact on medical processes and the practices of healthcare professionals. Both caregivers and patients can monitor and analyze health issues and communicate with ease with the help of technology. Communication is the basis on which any good relationship stands, and this is especially true for the provider-patient relationship. More and more healthcare organizations are using voice-controlled platforms to make hands-free communication easier. Voice assistants like Alexa and Google Home have the potential to revolutionize healthcare by increasing efficiency and allowing easier access to medical information. This paper explores the customer-focused and innovative services developed with the Alexa and Google Home devices and pinpoints the benefits of this technology in healthcare.
|
V. Zvoncak, J. Mekyska (Department of Telecommunications and SIX Research Centre, Brno University of Technology, Brno, Czech Republic), K. Safarova (Department of Psychology, Faculty of Arts, Masaryk University, Brno, Czech Republic), Z. Smekal (Department of Telecommunications and SIX Research Centre, Brno University of Technology, Brno, Czech Republic), P. Brezany (Faculty of Computer Science, University of Vienna, Vienna, Austria) New Approach of Dysgraphic Handwriting Analysis Based on the Tunable Q-Factor Wavelet Transform
Developmental dysgraphia is a neurodevelopmental disorder present in up to 30% of elementary school pupils. Since it is associated with handwriting difficulties (HD), it has a detrimental impact on children's academic progress, emotional well-being, attitude and behaviour. Researchers have recently proposed new approaches to HD assessment utilizing digitizing tablets, i.e. children's handwriting is quantified by a set of conventional parameters, such as velocity, duration of handwriting, tilt, etc. The aim of this study is to explore the potential of newly designed online handwriting features based on the tunable Q-factor wavelet transform (TQWT) for computerized HD identification. Using a digitizing tablet, we recorded a written paragraph of 97 children who were also assessed by the Handwriting Proficiency Screening Questionnaire for Children (HPSQ-C). We evaluated the discrimination power (binary classification) of all parameters using random forest and support vector machine classifiers in combination with sequential floating forward feature selection. Based on the experimental results, we observed that the newly designed features outperformed the conventional ones (accuracy = 79.16%, sensitivity = 86.22%, specificity = 73.32%). When considering the combination of all parameters (including the conventional ones), we reached 84.66% classification accuracy (sensitivity = 88.70%, specificity = 82.53%). The most discriminative parameters were based on vertical movement and pressure, which suggests that children with HD were not able to maintain a stable force on the pen tip and that their vertical movement is less fluent. The new features we introduced go beyond the state of the art and improve the discrimination power of the conventional parameters by approximately 20.0%.
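The evaluation protocol named above (random forest plus sequential floating forward selection, scored by accuracy, sensitivity and specificity) can be sketched as follows. The data, feature count and selection size are placeholders, not the paper's setup.

```python
# A minimal sketch: binary HD classification with SFFS feature selection
# (mlxtend) and a random forest. X and y stand in for the TQWT features
# of the 97 children.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_predict
from mlxtend.feature_selection import SequentialFeatureSelector as SFS

rng = np.random.default_rng(0)
X = rng.normal(size=(97, 40))            # 97 children, 40 hypothetical features
y = rng.integers(0, 2, size=97)          # 0 = typical, 1 = handwriting difficulties

clf = RandomForestClassifier(n_estimators=300, random_state=0)
sffs = SFS(clf, k_features=10, forward=True, floating=True,
           scoring="accuracy", cv=5)
sffs.fit(X, y)
X_sel = X[:, list(sffs.k_feature_idx_)]  # keep the selected feature subset

pred = cross_val_predict(clf, X_sel, y, cv=5)
tp = np.sum((pred == 1) & (y == 1)); tn = np.sum((pred == 0) & (y == 0))
fp = np.sum((pred == 1) & (y == 0)); fn = np.sum((pred == 0) & (y == 1))
print("accuracy:", (tp + tn) / len(y),
      "sensitivity:", tp / (tp + fn), "specificity:", tn / (tn + fp))
```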
|
S. Camarasu-Pop, C. Lartizien, P. Wassong, A. Bonnet, T. Grenier (CNRS - Creatis, Villeurbanne, France), V. Hamar, F. Hernandez (CC-IN2P3, Villeurbanne, France), L. Arrabito, J. Bregeon (Université de Montpellier, Montpellier, France), P. Gay (Université de Bordeaux, Bordeaux, France), A. Tsaregorodtsev (Centre de Physique des Particules de Marseille, Marseille, France) Exploiting GPUs on Distributed Infrastructures for Medical Imaging Applications with VIP and DIRAC
GPU usage has become mandatory for the processing of (3D) medical data, as well as for efficient machine learning approaches such as deep learning. In this contribution, we present how VIP and DIRAC can be leveraged to run medical image processing applications on distributed computing resources equipped with GPUs.
VIP (Virtual Imaging Platform) is a web portal (https://vip.creatis.insa-lyon.fr) for the simulation and processing of massive data in medical imaging. VIP users can access applications as a service, along with substantial computing and storage resources, with no technical skills required beyond the use of a web browser. VIP relies on the DIRAC (Distributed Infrastructure with Remote Agent Control) interware for scheduling tasks for execution on distributed infrastructures such as grids, clouds, and local clusters. New applications are regularly integrated into VIP/DIRAC. They all have their own requirements, among which GPU usage is more and more frequent.
This contribution will give (i) an overview of the targeted medical applications and their requirements and (ii) technical insights on how VIP and DIRAC allow end users to efficiently exploit GPU resources with no specific knowledge about the underlying distributed infrastructure.
|
E. Ajdaraga, M. Gusev (Faculty of Computer Science and Engineering, University of Ss. Cyril and Methodius, Skopje, Macedonia) Analysis of a Differential Noise Detection Filter in ECG Signals
Noise detection presents a big challenge in wearable heart monitor technology and beat detection. A distorted signal is hard to interpret and, as a consequence, valuable information may be lost. In this paper we present our research on developing a new noise detection filter based on the differential filter. Our analysis offers an in-depth evaluation of the optimal values of two critical parameters - window size and signal-to-noise threshold. Our final goal is to minimize QRS detection and beat classification errors caused by noise-distorted data.
|
R. Trobec (Jožef Stefan Institute, Ljubljana, Slovenia), M. Jan (University Medical Centre, Ljubljana, Slovenia), M. Lindén, I. Tomašić (Mälardalen University, Västerås, Sweden) Detection and Treatment of Atrial Irregular Rhythm with Body Gadgets and 35-channel ECG
Atrial irregular rhythm, often reflected in atrial fibrillation or undulation, is recognized as one of the major causes of brain stroke because it increases the likelihood of blood clot formation. Its early detection, which is possible with long-term measurements with body ECG sensors, is becoming an increasingly important preventive measure. Treatment of atrial fibrillation is possible with cardiac rhythm conversion, atrial catheter ablation, or with antiarrhythmic drugs. Ablation seems to be the most promising, but unfortunately it is not as successful as one would like. The paper presents a methodology for the detection of atrial fibrillation with an ECG body sensor. Subsequent non-invasive 35-channel ECG measurements are used for a preliminary analysis of atrial arrhythmia. The obtained results suggest that the described methodology could be useful in all aspects of treatment of atrial irregular rhythm. One can obtain reliable information on the incidence and duration of fibrillation events, can analyze and determine arrhythmic foci and conductive pathways in the heart atria, and finally, can study the effect of antiarrhythmic drugs on existing arrhythmias and on the eventual development of new types of arrhythmias.
|
I. Grubišić, D. Davidović, K. Skala (Ruđer Bošković Institute, Zagreb, Croatia), M. Depolli, M. Mohorčič, R. Trobec (Jožef Stefan Institute, Ljubljana, Slovenia) Enriching Heart Monitoring with Accelerometer Data
Because long-term monitoring produces large amounts of measurement data, it is critically important to save physicians' time in making a diagnosis. In this paper, we present our research on enhancing the ECG signal from an existing portable ECG device with accelerometer data. The research aims to extract and recognise the patient's pose and activity and to annotate the ECG signal on the time scale. With this new information about patient activity added alongside the ECG record, the physician can tell whether a change in heart frequency is caused by the heart itself or by the patient's movement activity.
|
Papers
P. Pereira (Instituto de Telecomunicações, Leiria, Portugal), R. Fonseca-Pinto (Polytechnic Institute of Leiria, Leiria, Portugal), Paiva (DEI - FCTUC, University of Coimbra, Coimbra, Portugal), L. Távora (Polytechnic Institute of Leiria, Leiria, Portugal), P. Assunção, S. Faria (Instituto de Telecomunicações, Leiria, Portugal) Accurate Segmentation of Dermoscopic Images Based on Local Binary Pattern Clustering
Segmentation is a key stage in dermoscopic image processing, where the accuracy of the border line that defines skin lesions is of utmost importance for subsequent algorithms (e.g., classification) and computer-aided early diagnosis of serious medical conditions.
This paper proposes a novel segmentation method based on Local Binary Patterns (LBP), where LBP and K-Means clustering are combined to achieve a detailed delineation in dermoscopic images. In comparison with usual dermatologist-like segmentation (i.e., the available ground truth), the proposed method is capable of finding more realistic borders of skin lesions, i.e., with much more detail. The results also exhibit reduced variability amongst different performance measures and they are consistent across different images.
The proposed method can be applied to cell-based segmentation adapted to the specificities of lesion border growth. Hence, the method is suitable for following the growth dynamics associated with the lesion border geometry in skin melanocytic images.
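A minimal sketch of an LBP-plus-K-means pipeline of the kind described above is shown below. The LBP parameters (P=8, R=1), the two-cluster setup, and the per-pixel feature choice are illustrative assumptions; the paper's exact scheme may differ.

```python
# A minimal sketch: per-pixel LBP features clustered with K-means to split
# a dermoscopic image into lesion and background. Input file is hypothetical.
import numpy as np
from skimage import io, color
from skimage.feature import local_binary_pattern
from sklearn.cluster import KMeans

img = color.rgb2gray(io.imread("dermoscopy.png"))   # hypothetical input image
lbp = local_binary_pattern(img, P=8, R=1, method="uniform")

# Per-pixel features: intensity plus LBP code, clustered into two groups.
feats = np.stack([img.ravel(), lbp.ravel()], axis=1)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(feats)
mask = labels.reshape(img.shape)

# Assume the darker cluster is the lesion.
lesion = mask == np.argmin([img[mask == k].mean() for k in (0, 1)])
```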
|
J. Bogatinovski, D. Kocev, A. Rashkovska (Jožef Stefan Institute, Ljubljana, Slovenia) Feature Extraction for Heartbeat Classification in Single-lead ECG
The recent trends in ECG device development are heading towards wireless leadless differential ECG sensors. The lightweight design of these wireless sensors allows patients to wear them comfortably for a long period of time and during their ordinary everyday activities. Long-term ECG recordings are intended to help in the detection or diagnosis of heart diseases. These measurements are significantly longer and more heterogeneous than measurements performed in a controlled hospital environment. Consequently, their manual inspection is a tedious, hard and expensive job. An alternative is to use computational techniques for automatic classification of heartbeats and arrhythmia detection. In this paper, we propose methods for feature extraction in single-lead ECG for the purpose of heartbeat classification. The feature extraction methods used originate from the field of time series analysis. The obtained features are then coupled with a classification algorithm to obtain predictive models. The usefulness of the proposed approach is demonstrated on the MIT-BIH arrhythmia database and on a dataset from single-lead ECGs.
|
L. Bento, F. Cunha (Instituto de Telecomunicações, Leiria, Portugal), L. Távora (Polytechnic Institute of Leiria, Leiria, Portugal), P. Assunção, S. Faria (Instituto de Telecomunicações, Leiria, Portugal), R. Fonseca-Pinto (ciTechCare - Center for Innovative Care and Health Technology, Leiria, Portugal) A Methodology for Laser Speckle Simulation in Controlled Dynamic Processes
The use of coherent light in imaging is the basis of several technologies extending from nanotechnology, biomedicine and structural biology to metrology. Regarding laser speckle imaging in health-related conditions, several applications have been developed in recent years, emphasizing the potential of laser speckle as a functional imaging methodology, used either as an individual modality or in a multimodal imaging scheme. To assess distinct acquisition methodologies, several experimental protocols have been tested, and new activity descriptors have also been developed. These image-processing-derived descriptors are core to speckle characterization in dynamic physiological conditions. Accordingly, using computer simulation algorithms to reproduce the phenomena while avoiding acquisition noise is a research topic of great interest in the community, as a way to test descriptor performance in a controlled way. In this work, a methodology for laser speckle simulation of dynamic processes is presented. The proposed algorithm allows controlling how the process varies by setting linear, quadratic, sinusoidal or mixed behaviour during the simulation period.
|
L. Borozan, D. Matijević (Department of Mathematics, University of Osijek, Osijek, Croatia), S. Canzar (Gene Center Munich - LMU Munich, Munich, Germany) Properties of the Generalized Robinson-Foulds Metric
Comparing arboreal structures is a problem with many applications in various fields of biology. In this paper we mainly focus on phylogenetic trees that are compared in order to quantify the similarities between different biological processes (e.g. phylogenetic models of tumor progression and metastasis). The common measure of similarity between two arboreal structures is the Robinson-Foulds (RF) metric. In particular, the generalized RF metric corrects some of its flaws while retaining its widely appreciated properties. We have conducted a thorough experimental analysis on real-world and simulated data for the symmetric difference and Jaccard weight dissimilarity measures. Our main aim was to deepen the understanding of the properties of the generalized RF metric and of the impact of dissimilarity measures on the computed distance. For the computation we use a state-of-the-art integer branch-and-cut solver, Trajan (available at https://github.com/canzarlab/trajan).
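For orientation, the classical RF metric that the generalized variant builds on can be computed directly: it is the size of the symmetric difference of the two trees' clade sets. The sketch below is an illustrative textbook implementation for rooted trees, not the paper's solver.

```python
# A minimal sketch: classical Robinson-Foulds distance for rooted trees,
# represented as dicts mapping internal nodes to children; leaves are strings.
def clades(tree, node):
    """Return the set of leaf-sets (frozensets) of all subtrees under node."""
    if node not in tree:                       # leaf
        return {frozenset([node])}
    out, here = set(), set()
    for child in tree[node]:
        sub = clades(tree, child)
        out |= sub
        here |= frozenset().union(*sub)
    out.add(frozenset(here))
    return out

def rf_distance(t1, r1, t2, r2):
    c1, c2 = clades(t1, r1), clades(t2, r2)
    return len(c1 ^ c2)                        # symmetric difference of clade sets

t1 = {"root": ["x", "A"], "x": ["B", "C"]}
t2 = {"root": ["y", "C"], "y": ["A", "B"]}
print(rf_distance(t1, "root", t2, "root"))     # 2: clades {B,C} and {A,B} differ
```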
|
E. Ajdaraga, M. Gusev (Faculty of Computer Science and Engineering, University of Ss. Cyril and Methodius, Skopje, Macedonia), L. Poposka (University Clinic of Cardiology, University of Ss. Cyril and Methodius, Skopje, Macedonia) Evaluation of User Satisfaction with the ECGalert System
This paper offers insight into user satisfaction with ECGalert, a heart monitoring service based on a wearable device.
We report results from a survey that gathered the opinions of both patients and doctors, as well as an evaluation of user experience on a random sample.
In addition, a random sample of the general population was included through screenings at three different events: a World Heart Day event, a marathon, and a stair climb.
The results show that the majority of respondents have a positive attitude towards telemedicine systems. A very interesting finding is that even the older population is open to changing its attitude towards new and improved technology, mostly due to the increased comfort and the newly offered possibilities for real-time monitoring.
|
M. Gusev, M. Boshkovska (Ss. Cyril and Methodius University, Skopje, Macedonia) Performance Evaluation of Atrial Fibrillation Detection
Atrial fibrillation (AFib) is one of the most common cardiovascular diseases, and serious consequences can occur if it is not treated in time.
Although AFib is defined as an irregular rhythm with the absence of P waves in the ECG signal, the algorithms for AFib detection are still not perfect. When analyzing the state-of-the-art literature, we found several problems in identifying a consistent methodology that would allow comparison of different methods. The inconsistencies lie in the definition of the set of benchmark tests and in the key performance indicators used.
For example, many authors use different datasets, analyzing one or more different reference databases, including a selected set of records, training the algorithms on a particular set and then testing them on the remainder of the set.
On the other hand, some have used accuracy or specificity measures, others only sensitivity, without the positive predictive rate.
Some authors show that their algorithms reach high sensitivity values (over 95%), although the achieved positive predictive rate is low.
In this paper, we present three different methodologies and argue which key performance indicators reveal the best performance evaluation. We conclude that the F1 score, also called F score or F measure, should be chosen; put another way, the F1 score conveys the balance between the positive predictive value (also called precision) and sensitivity (also called recall).
We also found that the way the true and false positives and true and false negatives are determined varies from one approach to another, and we argue which approach is the most relevant.
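A worked numeric sketch of the point made above: the F1 score balances precision and recall, so a detector with 95% sensitivity but many false alarms still scores low. The counts below are invented for illustration.

```python
# F1 = 2 * PPV * Se / (PPV + Se), the harmonic mean of precision and recall.
def f1(tp, fp, fn):
    ppv = tp / (tp + fp)          # positive predictive value (precision)
    se = tp / (tp + fn)           # sensitivity (recall)
    return 2 * ppv * se / (ppv + se)

# 95 of 100 true AFib episodes found, but 80 false alarms:
print(f1(tp=95, fp=80, fn=5))     # ~0.69 despite 95% sensitivity
```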
|
J. Marić, M. Šikić (Faculty of Electrical Engineering and Computing, Zagreb, Croatia) Approaches to Metagenomic Classification and Assembly
A microbiome is an ecological community of commensal, symbiotic, and pathogenic microorganisms that share the same environment. The study of the microbiome, i.e. of genetic material sampled directly from environmental samples, is called metagenomics. In recent years, methods of genome sequencing have dramatically improved and the number and variety of sequenced genomes have rapidly increased. New technology has significantly increased the variety and complexity of microbiome research, and ever-larger datasets present new challenges in the analysis of metagenomic data.
The two main tasks of metagenomic analysis are classification of sequenced metagenomic data into taxa and assembly of the data into longer contiguous sequences. The final aim of both tasks is to correctly identify the species present in the metagenomic sample. This has various applications in medicine (infectious disease diagnosis), development of biofuels, biotechnology, agriculture, and many other areas.
In this paper, we present a description of common procedures and methods for metagenomic data analysis and the challenges facing these procedures. We give an overview of existing software tools and a review of public genome databases used in metagenomic analysis. Finally, we explore possible improvements to the existing methods for metagenomic classification and assembly and propose improvements in genome database management.
|
U. Marhl (Institute of Mathematics, Physics and Mechanics/Department of Physics, Ljubljana, Slovenia), A. Jodko-Wladzinska (Physikalisch-Technische Bundesanstalt, Warsaw, Poland), R. Brühl, T. H. Sander (Physikalisch-Technische Bundesanstalt, Berlin, Germany), V. Jazbinšek (Institute of Mathematics, Physics and Mechanics/Department of Physics, Ljubljana, Slovenia) Application of Source Localization Algorithms in Magnetoencephalography: Test on a New Generation of Magnetometers
Magnetoencephalography (MEG) is a noninvasive neuroimaging technique for measuring activity in the brain. In the vicinity of the head, it measures the magnetic field produced by the electric currents in neurons. From the measured magnetic field, using various computational techniques, we determine the location of the source. In this paper, we present the main methods for localizing activated areas in the brain with data obtained from a 128-channel superconducting quantum interference device (SQUID) system. Great emphasis is placed on solving the inverse problem, i.e. finding the source from the magnetic field measured around the head, by minimization against a forward model that calculates the magnetic field inside a conducting sphere. We also demonstrate each step in the processing of the magnetic resonance images (MRI), which we use for more precise source localization. We address the major drawbacks of using SQUID gradiometers. As an alternative, we present the new generation of magnetometers, optically pumped magnetometers (OPM), which operate at room temperature. Compared to SQUIDs, they can be placed closer to the head with the use of a custom 3D-printed sensor holder. We also present the first test results with a custom system of 15 OPMs.
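To make the inverse-problem idea concrete, here is a deliberately simplified sketch: a grid search for a single current dipole using the free-space forward model B = (mu0/4pi) q x R / |R|^3, rather than the conducting-sphere model used in the paper. Sensor positions, the search grid and the dipole moments are placeholders.

```python
# A minimal sketch of dipole localization by residual minimization.
import numpy as np

MU0_4PI = 1e-7   # mu0 / 4pi in SI units

def dipole_field(r_dip, q, sensors):
    """Free-space field of a current dipole q (A*m) at r_dip, at each sensor."""
    R = sensors - r_dip                              # (n_sensors, 3)
    return MU0_4PI * np.cross(q, R) / np.linalg.norm(R, axis=1, keepdims=True) ** 3

def localize(b_meas, sensors, grid, q_grid):
    """Exhaustive search for the (position, moment) pair minimizing the residual."""
    best, best_err = None, np.inf
    for r in grid:
        for q in q_grid:
            err = np.sum((dipole_field(r, q, sensors) - b_meas) ** 2)
            if err < best_err:
                best, best_err = (r, q), err
    return best

# Synthetic check: recover a known dipole from noiseless data.
sensors = np.random.default_rng(0).normal(size=(128, 3)) * 0.12  # placeholder layout
truth = (np.array([0.02, 0.0, 0.05]), np.array([0.0, 1e-8, 0.0]))
b = dipole_field(*truth, sensors)
grid = [np.array([x, 0.0, 0.05]) for x in np.linspace(-0.04, 0.04, 9)]
print(localize(b, sensors, grid, [truth[1]])[0])     # recovers x = 0.02
```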
|
L. Šajn (Faculty of Computer and Information Science, University of Ljubljana, Ljubljana, Slovenia) Detecting White Spot Lesions Caused by Teeth Alignment Treatment
When permanent orthodontic braces are removed, dentists typically encounter the initial phase of tooth demineralization, manifesting as white spot lesions on the teeth's smooth surfaces. We developed a prototype for automatic segmentation of teeth and white spot lesions, which may contribute to a more accurate and objective way of monitoring treatment. During development, we used various image processing techniques and image segmentation paradigms. The developed prototype was evaluated on our own image database, built from selected images of clinical examinations, since an open annotated database is difficult to obtain. The prototype showed promising results, with a lot of potential for improvements in future work.
|
Z. Juhasz, M. Issa (University of Pannonia, Veszprem, Hungary) EEG Based Imaging of Stroke Location, Extent and Progress of Recovery Using a GPU Architecture
The success of stroke treatment and rehabilitation largely depends on the efficiency of treatment in the first critical period of the stroke. There is increasing interest in using EEG as a supplementary method for monitoring treatment efficiency and potential derangements of brain function. Various quantitative EEG measures, notably the Brain Symmetry Index and the Delta-Alpha Ratio, have been recommended for characterising the status of the stroke. While these methods use low-density clinical EEGs, in this exploratory study we investigated the benefit of high-resolution EEG as an imaging tool for visualising the location and extent of the stroke.
Resting-state EEG data were recorded with a 128-channel Biosemi ActiveTwo device in the eyes-closed condition. After pre-processing and artefact removal, the power spectral density was calculated for each channel, from which the Brain Symmetry Index (a left-right hemisphere asymmetry measure) and the Delta-Alpha Ratio (the ratio of power in the delta (1-4 Hz) and alpha (8-13 Hz) bands) were computed. In addition to computing the global index values, the individual channel values were used to create 2D power and symmetry maps, which clearly show the location of the stroke as high-intensity map areas. Recovery progress is determined by comparing these maps to ones measured 2 months after the stroke incident. The level of decrease or elimination of the stroke areas can be quantified to indicate the success of treatment.
A high-performance parallel GPU implementation was developed that is capable of computing the indices and maps in milliseconds, making the method usable in various real-time clinical application settings, e.g. continuous stroke monitoring, rehabilitation follow-ups, carotid surgery, etc. Further clinical experiments are planned for statistical validation and for evaluating the benefits of this new method in clinical settings.
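The per-channel spectral step described above can be sketched on the CPU as follows. Channel layout, the left/right pairing and the synthetic data are placeholders, and this simple normalized difference is only one of several published symmetry-index variants.

```python
# A minimal sketch: per-channel Delta-Alpha Ratio from the Welch PSD,
# plus a simple left/right asymmetry measure.
import numpy as np
from scipy.signal import welch

def band_power(psd, freqs, lo, hi):
    sel = (freqs >= lo) & (freqs < hi)
    return psd[..., sel].sum(axis=-1)

def dar_map(eeg, fs):
    """eeg: (n_channels, n_samples) resting-state segment."""
    freqs, psd = welch(eeg, fs=fs, nperseg=4 * fs)
    delta = band_power(psd, freqs, 1.0, 4.0)
    alpha = band_power(psd, freqs, 8.0, 13.0)
    return delta / alpha                     # one DAR value per channel

rng = np.random.default_rng(0)
eeg = rng.normal(size=(128, 60 * 256))       # 128 channels, 60 s at 256 Hz
dar = dar_map(eeg, fs=256)

# Asymmetry between hypothetical left/right channel index pairs:
left, right = np.arange(0, 64), np.arange(64, 128)
asymmetry = np.abs(dar[left] - dar[right]) / (dar[left] + dar[right])
```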
|
D. Tomić (Ruđer Bošković Institute, Zagreb, Croatia), B. Pirkić (Faculty of Veterinary Medicine, Zagreb, Croatia), K. Skala (Ruđer Bošković Institute, Zagreb, Croatia), L. Kranjčević (Faculty of Engineering, Rijeka, Croatia) Predicting the Effectiveness of Multi-drug Cancer Therapies
Despite the ongoing development of new targeted cancer drugs, the survival rate of patients with aggressive forms of cancer, like lung and pancreatic cancer, is still poor. The main reason is the ability of cancer to develop resistance against cancer drugs. One strategy to overcome this resistance is to use cancer therapies with several drugs administered at the same time, which can increase the chances of killing cancer cells before they develop resistance. To investigate the effectiveness of such therapies, we let Vini, an in silico model of cancer, calculate the most effective 2-drug therapies against non-small cell lung cancer (NSCLC), small cell lung cancer (SCLC), and pancreatic cancer. Vini identified the combination of vinorelbine with paclitaxel as the most effective against NSCLC, the combination of everolimus with doxorubicin as the most effective against SCLC, and the combination of everolimus with paclitaxel as the most effective against pancreatic cancer. As existing clinical studies confirm Vini's calculations, it is justified to let Vini search for combined cancer therapies with even more drugs. To further increase their effectiveness, the next research step will be the personalization of such therapies.
|
P. Brezany (University of Vienna, Faculty of Computer Science, Vienna, Austria), M. Janatova (Charles University, Prague, Czech Republic), O. Stepankova (Department of Cybernetics FEL CVUT, Prague, Czech Republic), M. Lenart, M. Edward (University of Vienna, Faculty of Computer Science, Vienna, Austria), M. Uller (Czech Institute of Informatics, Robotics and Cybernetics, Prague, Czech Republic), R. Burget (SIX Research Centre, Brno University of Technology, Brno, Czech Republic) Management of Physiological Data Streams within Brain Disorder Rehabilitation
This paper develops a novel telemedicine solution for the rehabilitation of balance disorders, based on the analysis and visualization of patient physiological data gathered by a set of sensors during the rehabilitation process. The training and physiological data sensing are based on two home-grown systems called Homebalance and Scope. The kernel functionality is implemented with the data stream management technology provided by the Esper system. Moreover, the solution supports training at home, integrated with specialized rehabilitation centers through Dew/Cloud technology. A small core of early adopters is currently successfully conducting balance disorder rehabilitation according to a methodology relying on the proposed approach.
|
Data Science
Chairs: Davor Davidović and Zorislav Šojat
Papers
S. Petrushevski, M. Gusev, V. Zdraveski (Ss. Cyril and Methodius University, Faculty of Computer Science and Engineering, Skopje, Macedonia) Calculating Average Shortest Path Length Using Compute Unified Device Architecture (CUDA)
Large graphs with millions and even billions of vertices are found in many real-life network analyses, and processing them is challenging. One of the toughest tasks is computing the average shortest path length in a large network, which requires a lot of memory and processing time while calculating many independent paths. Hence, this task is a good candidate for parallelization. The idea of using graphics processing units (GPUs) for general-purpose computing is not new, and with recent increases in performance and memory capacity, they make a perfect candidate for working with graphs. We explore how Dijkstra's algorithm can be used to calculate the average shortest-path length, how it can be applied in a massively parallelized system, and what the performance gains and drawbacks are.
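As a CPU baseline for the quantity being computed (not the paper's CUDA code), the sketch below derives the average shortest-path length of a random sparse graph from repeated single-source Dijkstra runs. Each source is independent, which is exactly what makes the task amenable to GPU parallelization. Graph size and density are arbitrary.

```python
# A minimal sketch: average shortest-path length via all-pairs Dijkstra.
import numpy as np
from scipy.sparse import random as sparse_random
from scipy.sparse.csgraph import dijkstra

n = 500
adj = sparse_random(n, n, density=0.02, random_state=0, format="csr")

dist = dijkstra(adj, directed=True)          # (n, n) all-pairs distances
finite = np.isfinite(dist) & ~np.eye(n, dtype=bool)
avg_path_len = dist[finite].mean()           # average over reachable pairs
print(avg_path_len)
```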
|
G. Benko, Z. Juhasz (University of Pannonia, Veszprem, Hungary) GPU Implementation of the FastICA Algorithm
Independent Component Analysis (ICA) is a key tool in the EEG artifact removal process. By estimating the original signal sources from a recorded mixture as components, it becomes possible to identify non-cerebral source components, such as muscle noise, eye movements and blinks, as well as ECG contamination, and remove them to reconstruct the original, artifact-free EEG measurement. The disadvantage of the ICA method is its computational cost; it takes several minutes to process even a few-second data segment. This paper shows the implementation and optimisation of the FastICA algorithm on a GPU architecture and compares its execution performance (speedup) and numerical accuracy with standard implementations frequently used in the EEG community. A few representative examples are also shown that illustrate the artifact removal capability of the final implementation.
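The unmix/zero/remix workflow that FastICA enables can be sketched with sklearn's CPU implementation as below. The data shapes and the choice of artifact components are placeholders; in practice components are selected by inspection or automated criteria.

```python
# A minimal sketch: ICA-based artifact removal on a synthetic EEG mixture.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
eeg = rng.normal(size=(10_000, 64))          # samples x channels mixture

ica = FastICA(n_components=64, random_state=0)
sources = ica.fit_transform(eeg)             # samples x components

artifact_idx = [0, 3]                        # hypothetical blink/ECG components
sources[:, artifact_idx] = 0.0
cleaned = ica.inverse_transform(sources)     # artifact-free reconstruction
```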
|
M. Depolli, R. Trobec (Jožef Stefan Institute, Ljubljana, Slovenia) Computational Efficiency of Linear System Construction for MLPG Method on a Multicore Computer
A contact problem is simulated and surface stresses are analyzed in order to determine the conditions for crack initiation due to fretting fatigue. A weak-form Meshless Local Petrov-Galerkin (MLPG) method is used for simulating the displacements in the material due to the applied external forces. The solution accuracy increases with higher nodal densities under the contact, but so does the calculation time. To manage the computation time, parallel programming is used. The experimental setup is built around a 16-core processor with simultaneous multi-threading (SMT) capabilities, to test how well the parallelism in modern hardware can be exploited for solving a single task. We analyze the computation times of the linear system construction to evaluate the maximum expected efficiency of parallelization.
|
B. Stojanovič, J. Slak, G. Kosec (“Jožef Stefan” Institute, Ljubljana, Slovenia) RBF-FD Solution of Electromagnetic Scattering Problem
In recent years, mesh-free approaches have become a widely used alternative to conventional methods, such as finite element methods, for solving numerical scattering problems and other PDEs. In this work, a local Radial Basis Function-generated Finite Differences (RBF-FD) method is used to investigate the electromagnetic scattering problem of an infinitely long anisotropic circular cylinder. The method proves useful for treating both the complex-valued solutions of the problem and the material discontinuity at the junction between the anisotropic cylinder and free space, while also providing great performance with parallel discretization. The numerical solution is compared to a known analytical solution in terms of accuracy and radar cross section.
|
J. Močnik-Berljavac, J. Slak, G. Kosec ("Jožef Stefan" Institute, Ljubljana, Slovenia) Parallel Simulation of Time-Domain Acoustic Wave Propagation
Simulations of acoustic wave propagation are an important tool for reconstructing the structure of Earth's subsurface. The core of such simulations is an efficient and accurate method for time-domain acoustic wave simulation in an inhomogeneous domain. The Radial Basis Function-generated Finite Differences (RBF-FD) method is a popular variant of local strong-form meshless methods that does not require predefined connections between nodes, making it easier to adapt the node distribution to the problem under consideration.
This paper explores RBF-FD as an alternative to traditional FDM-based methods for time-domain wave propagation. It is demonstrated that RBF-FD provides accurate results even in challenging cases where conventional methods struggle to obtain even a stable solution.
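The core RBF-FD building block used in both of the papers above is the computation of local differentiation weights. A minimal sketch follows, using a Gaussian RBF for the 2D Laplacian; polynomial augmentation and shape-parameter tuning, which production RBF-FD codes use, are omitted for brevity.

```python
# A minimal sketch: RBF-FD weights for the Laplacian at one node, built from
# its neighbours by solving the local collocation system A w = b.
import numpy as np

def laplacian_weights(center, neighbors, eps=1.0):
    """Weights w such that lap(u)(center) ~ w . u(neighbors)."""
    d = neighbors[:, None, :] - neighbors[None, :, :]
    A = np.exp(-(eps * np.linalg.norm(d, axis=2)) ** 2)   # phi(|x_i - x_j|)
    r2 = np.sum((neighbors - center) ** 2, axis=1)
    # Laplacian of phi(r) = exp(-(eps r)^2) is (4 eps^4 r^2 - 4 eps^2) phi(r)
    b = (4 * eps**4 * r2 - 4 * eps**2) * np.exp(-(eps**2) * r2)
    return np.linalg.solve(A, b)

# Example: a symmetric 5-point neighbourhood gives weights close to the
# classic finite-difference Laplacian stencil [-4, 1, 1, 1, 1] / h^2.
h = 0.1
pts = np.array([[0, 0], [h, 0], [-h, 0], [0, h], [0, -h]], float)
print(laplacian_weights(np.zeros(2), pts, eps=0.5))
```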
|
Ž. Jeričević (KMS Technologies, Houston, United States) Fitting Sum of Exponentials to Experimental Data: Linearization by Numerical Integration Approximation
A solution to the important and difficult problem of resolving the components of multi-exponential decay is presented. Often, techniques are chosen that are noise-sensitive and whose error is poorly understood. A general, numerically robust and fast (real-time) solution method has been developed. Noise attenuation and the flexibility of the method are analyzed in detail. Its relevance for signal analysis from different relaxation processes, in particular NMR medical imaging, is discussed.
Multi-exponential decays most often result from parallel, independent relaxation processes, the decay of a mixture of radionuclides, parallel chemical reactions of the first order, etc. The inverse problem of finding components with close decay constants in a multi-exponential signal is inherently ill-posed because of the non-orthogonality of exponential functions. The result is very sensitive to noise and to the chosen optimization methodology.
A general solution to the problem of separating exponentials, based on linear approximation through repeated numerical integration, has been developed. The algorithm is based on the least squares method with the possibility of using non-negative constraints. Using a linear approximation avoids the problems of supplying a good initial guess, nonproductive iterations, and local minima. The algorithm also includes an implementation of the global method by Knutson.
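A worked sketch of the linearization idea for the single-exponential case (the paper's method generalizes to sums via repeated integration): for y(t) = A exp(-lam t), the running integral Y(t) = ∫0^t y dt satisfies y = A - lam * Y, so A and lam follow from one linear least-squares fit, with no initial guess or iteration.

```python
# A minimal sketch: recover A and lam from a noisy single exponential.
import numpy as np

t = np.linspace(0, 5, 200)
A_true, lam_true = 2.0, 1.3
y = A_true * np.exp(-lam_true * t) + 0.01 * np.random.default_rng(0).normal(size=t.size)

# Running integral by the trapezoid rule, then the linear model y ~ A - lam * Y.
Y = np.concatenate(([0.0], np.cumsum((y[1:] + y[:-1]) / 2 * np.diff(t))))
M = np.column_stack([np.ones_like(t), Y])
coef, *_ = np.linalg.lstsq(M, y, rcond=None)
A_est, lam_est = coef[0], -coef[1]
print(A_est, lam_est)                          # ~2.0 and ~1.3
```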
|
Z. Šojat, K. Skala (Ruđer Bošković Institute, Zagreb, Croatia) The Rainbow: Integrating Computing into the Global Ecosystem
Are we aware of the fact that our children will have to live in the world we develop and create today? Unintentionally, and with little or no self-awareness, computer science has crept into every aspect of life, or, better to say, almost every aspect of the natural and social environment has crept into the lap of computer science. Somehow much in the world today came to be the responsibility of computer science. Unfortunately, the development was so rapid and mostly stochastic that most of us, computer scientists and computer engineers, application, software and computing paradigm inventors and developers, did not even have a chance yet to realise the responsibility we are taking on. The ecosystems we live in, from the ecosystem of the family to the ecosystem of social groups and groupings, from the ecosystem of the human body itself to the ecosystem of globally interdependent economics, and from the ecosystem of the living beings sharing this world with us up to the ecosystem of the whole Earth, are more and more being monitored, controlled and even steered by a myriad of human-produced artifacts, all-pervasively based on data processing equipment, or, commonly said, computers. It is very obvious that in modern days computer science plays an extremely influential role in all these ecosystems. It is also obvious that computer science has to take responsibility for its ideas and actions in those fields. In the overall "computing" ecosystem there is actually a kind of "evaporation" of information and requests upwards from Nature to Dew to Human to Fog to Cloud, and a "rain" of services and processed information from the Cloud downwards. Hence the associative name of Rainbow – as an analogy to the sunray spectrum – an ecosystem in which we necessarily include both the "technical" and the "philosophical" aspects, or, in other words, both the Machines and the Humans. A consistent, robust and properly defined Rainbow Ecosystem will offer new possibilities of knowledge development and information usage for a very broad user base; it will enable proper maintenance of essential natural and human-generated ecosystems and huge savings in many areas of effort, as well as provide novel applications and actively respond to changing economy, supply and business needs, while keeping the Global Ecosystem in natural balance. Proper and responsible cooperation between humans, machines, the environment and nature, with the aim of freedom, security, prosperity and true betterment of life, is the only way forward.
|
J. Matković (JP Elektroprivreda HZ-HB d.d. Mostar, Mostar, Bosnia and Herzegovina), K. Fertalj (Fakultet elektrotehnike i računarstva, Sveučilišta u Zagrebu, Zagreb, Croatia) Visual Modelling of WS-BPEL Processes Using BPMN Standard
The WS-BPEL (Web Services Business Process Execution Language) standard focuses on integrating Web Services into units called processes, which can exhibit quite complex behaviour, like loops, conditional execution, parallel and synchronized behaviour, event-driven behaviour, behaviour for undoing already completed work, termination behaviour, variable manipulation and so on. Since the WS-BPEL standard offers many constructs, their combination in business process design can result in quite complex behaviour that is hard to design correctly when working directly in the WS-BPEL XML syntax. Thus, there is a need for a visual notation to design WS-BPEL processes visually and transform them into executable WS-BPEL XML code. The standard proposed in this article for the visual design of WS-BPEL processes is the BPMN (Business Process Modelling Notation) standard, which offers a rich set of visual constructs and is also based on XML notation in the background. The first part of the article focuses on defining pairs of constructs that are mapped between the BPMN and WS-BPEL standards, while the second part focuses on algorithms for bidirectional translation between the paired constructs of the two standards.
|
H. Nuić, Ž. Mihajlović (University of Zagreb, Faculty of Electrical Engineering and Computing, Zagreb, Croatia) Algorithms for Procedural Generation and Display of Trees
The main goal of this paper is to explore in which situations a particular procedural algorithm should be used to generate a tree model. Each algorithm is modified so as to be able to generate a tree model whose shape resembles the required model. The space colonization algorithm, an algorithm using particle flows, and an algorithm simulating a Lindenmayer system are compared with respect to the time needed to generate a tree of similar complexity. The voxelization procedure for the model and its application in this context are explained. A method for generating a tree mesh on a graphics card using Bézier curves is presented.
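Of the three approaches compared above, the Lindenmayer system is the simplest to illustrate: it is iterated string rewriting. The sketch below uses a standard textbook branching rule, not the paper's implementation.

```python
# A minimal sketch: an L-system is iterated string rewriting; the classic
# rule set below generates a plant-like skeleton for turtle-graphics
# interpretation ("F" draw forward, "+/-" turn, "[" push state, "]" pop).
rules = {"X": "F+[[X]-X]-F[-FX]+X", "F": "FF"}

def l_system(axiom, rules, iterations):
    s = axiom
    for _ in range(iterations):
        s = "".join(rules.get(ch, ch) for ch in s)
    return s

print(l_system("X", rules, 2)[:60], "...")
```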
|
G. Oparin, V. Bogdanova, A. Pashinin, S. Gorsky (Matrosov Institute for System Dynamics and Control Theory of Siberian Branch of Russian Academy of , Irkutsk, Russian Federation) Microservice-oriented Approach to Automation of Distributed Scientific Computations
We offer a new multi-agent approach based on microservices for solving computationally complex problems arising in scientific and applied research in several subject areas. The specificity of computational experiments for the considered class of problems lies in runtimes that increase exponentially with dimension, in multivariate calculations over different input data, and in the variability of the mathematical model and of the algorithm for solving the problem. The use of microservices provides reusability, ease of updating the components of a distributed application, and cross-platform portability, and it allows operating with modularity properties in the new conditions of distributed computing, where inter-module communication is provided only through the message passing mechanism. The software platform designed for the offered approach automates both the development of a distributed microservice application based on an applied program package and the organization of decentralized composition management of microservices. New mechanisms for deploying and updating microservices support synchronization between cloud knowledge bases and those installed on a user's computer, providing an additional opportunity to use the Dew computing paradigm, which combines the concept of Cloud computing with the capabilities of the user's local computers. The practice of solving computationally complex exhaustive problems (in particular, Boolean satisfiability and qualitative research of binary dynamic systems) has shown the effectiveness of the developed approach in comparison with existing ones.
|
Papers
A. Mujezinović, V. Ljubović (University of Sarajevo, Faculty of Electrical Engineering, Sarajevo, Bosnia and Herzegovina) Serverless Architecture for Workflow Scheduling with Unconstrained Execution Environment
Cloud computing, together with all its boons, has taken the world of computing by storm, reaping benefits across various domains. The introduction of serverless computing, and more precisely Function-as-a-Service (FaaS), removed many orchestration and maintenance issues that system designers were facing. This has inspired an emerging field of research on the utilization and optimization of serverless computing. The existing body of work on this topic is, to the best of the authors' knowledge, focused exclusively on using serverless functions (e.g. AWS Lambda, Google Cloud Functions). Such functions suffer from constraints on their execution environment, time, or available space. In this paper we present a more general approach that improves upon existing architectures revolving around cloud functions. By leveraging the AWS Fargate technology, we propose a fully serverless and highly scalable architecture that is based on the producer-consumer pattern and can be shaped to satisfy a wide range of requirements. Docker containers are used as worker nodes, which helps avoid the aforementioned constraints. This concept is put to use in a system for the acquisition of high-frequency data.
|
V. Stanisavljević (Sveučilište Sjever, Varaždin, Croatia) Comparison of Data Aggregation from a Wireless Network of Sensors Using Database Sharding and Foreign Data Wrappers
In this work, two built-in database technologies, database sharding and foreign data wrappers, were compared and applied to the aggregation of data from a network of sensor-equipped data collection nodes in a real manufacturing environment. The availability and feature-set differences of the methods on some popular database management systems were compared, and then a detailed case study on the PostgreSQL DBMS was performed. For testing the concept in a more realistic environment, the sensor network was implemented using a number of Raspberry Pi computers with local relational databases and wireless connectivity to the main aggregation database running on a much more powerful server computer. Both database methods were applied on all nodes and on the server, and performance was measured for different types of aggregation queries. The very same data aggregation concepts presented here could be applied to any sensor network or any Internet of Things network.
|
A. Šerifović-Trbalić (University of Tuzla, Faculty of Electrical Engineering, Tuzla, Bosnia and Herzegovina), A. Trbalić (Drvodom, Tuzla, Bosnia and Herzegovina), D. Demirović, E. Skejić (University of Tuzla, Faculty of Electrical Engineering, Tuzla, Bosnia and Herzegovina), D. Gleich (University of Maribor, Maribor, Slovenia) CT Metal Artifacts Reduction Using Convolutional Neural Networks
Artefacts caused by the presence of metallic implants and prostheses appear as dark and bright streaks in computed tomography (CT) images that obscure information about the underlying anatomical structures. These phenomena can severely degrade image quality and hinder correct diagnostic interpretation. Although many techniques for the reduction of metal artefacts have been proposed in the literature, their effectiveness is still limited. In this paper, an application of convolutional neural networks (CNN) to the problem of metal artefact reduction (MAR) in the image domain is investigated. Experimental results show that an image-domain CNN can substantially suppress streaking artefacts in the reconstructed images.
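A minimal sketch of an image-domain MAR network of this general kind is shown below; the residual design learns the artifact component and subtracts it from the corrupted slice. Layer sizes and depth are illustrative assumptions, not the paper's architecture.

```python
# A minimal sketch: a small residual CNN for image-domain artifact reduction.
import torch
import torch.nn as nn

class MARNet(nn.Module):
    def __init__(self, channels=64, depth=5):
        super().__init__()
        layers = [nn.Conv2d(1, channels, 3, padding=1), nn.ReLU(inplace=True)]
        for _ in range(depth - 2):
            layers += [nn.Conv2d(channels, channels, 3, padding=1),
                       nn.ReLU(inplace=True)]
        layers += [nn.Conv2d(channels, 1, 3, padding=1)]
        self.body = nn.Sequential(*layers)

    def forward(self, x):                 # x: corrupted CT slice, (N, 1, H, W)
        return x - self.body(x)           # residual learning: subtract artifacts

net = MARNet()
slice_in = torch.randn(1, 1, 256, 256)
print(net(slice_in).shape)                # torch.Size([1, 1, 256, 256])
```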
|
O. Bilalovic, Z. Avdagic, M. Kafadar (Faculty of Electrical Engineering, Sarajevo, Bosnia and Herzegovina) Improved Nucleus Segmentation Process Based on Knowledge Based Parameter Optimization in Two Levels of Voting Structures
Digital analysis and biomedical image processing have taken a huge part in modern medicine and biology, now more than ever. Digital pathology is just one of many areas of medicine being upgraded by constant biomedical engineering research and development. It is very important that disciplines like nucleus detection, image segmentation and classification become more and more effective, with minimum human intervention in these processes and maximum accuracy and precision. This paper presents improved optimization of the parameters of nucleus segmentation methods based on two levels of voting. The first level comprises hybrid nucleus segmentation based on 7 segmentation algorithms - Otsu, adaptive fuzzy c-means, adaptive K-means, KGB (kernel graph cut), APC (affinity propagation clustering), multimodal, and SRM (statistical region merging) - with optimization of the algorithms' parameters and a first-level voting structure. The second-level voting structure combines the segmentation results obtained at the first level with those of third-party segmentation tools: ImageJ/Fiji and MIB (Microscopy Image Browser). The definitive segmented image of a nucleus could serve as a generic ground-truth image, because it is formed as a consensus of several different segmentation methods and different parameter settings, which guarantees better objectivity of the results. In addition, this approach scales well to 3D-stack image datasets.
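The fusion step at the heart of such a voting structure can be sketched as a per-pixel majority vote over binary masks. The masks below are random placeholders for the outputs of the seven methods named above.

```python
# A minimal sketch: consensus nucleus mask by per-pixel majority vote.
import numpy as np

def majority_vote(masks):
    """masks: (n_methods, H, W) boolean arrays -> consensus (H, W) mask."""
    votes = np.sum(masks, axis=0)
    return votes > masks.shape[0] / 2        # keep pixel if >50% of methods agree

rng = np.random.default_rng(0)
masks = rng.random((7, 128, 128)) > 0.5      # stand-ins for 7 algorithm outputs
consensus = majority_vote(masks)
```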
|
T. Široki (FER, Zagreb, Croatia), S. Koska, N. Čorak, M. Futo, T. Domazet-Lošo (IRB, Zagreb, Croatia), M. Domazet-Lošo (FER, Zagreb, Croatia) Correspondence Analysis Applied to Large Scale Evo-Devo Data
Correspondence analysis (CA) is an exploratory method used for visualizing and understanding high-dimensional data. It is mostly applied to contingency tables. Although the method has been used in various fields for decades, including bioinformatics, to our knowledge it has not yet been applied to the evolutionary analysis of transcriptome and proteome data during the development of organisms. In this paper, we present the application of the CA method to linking the evolutionary age of genes with their expression data (transcriptome and proteome). The evolutionary age of genes was obtained by a phylostratigraphic approach, and expression data were collected during biofilm growth in the model bacterium Bacillus subtilis. The obtained results allow an evolutionarily meaningful interpretation of gene expression in Bacillus subtilis biofilm development.
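CA itself reduces to an SVD of the standardized residuals of the contingency table; the textbook sketch below (not the paper's pipeline) shows the computation on an invented table, e.g. gene counts per phylostratum (rows) and developmental stage (columns).

```python
# A minimal sketch: correspondence analysis via SVD of standardized residuals.
import numpy as np

N = np.array([[30.0, 12, 5],
              [10, 25, 8],
              [4, 9, 22]])                  # hypothetical contingency table

P = N / N.sum()                             # correspondence matrix
r, c = P.sum(axis=1), P.sum(axis=0)         # row and column masses
S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))   # standardized residuals

U, sv, Vt = np.linalg.svd(S, full_matrices=False)
row_coords = U * sv / np.sqrt(r)[:, None]   # principal coordinates of rows
col_coords = Vt.T * sv / np.sqrt(c)[:, None]
inertia = sv**2                             # variance explained per axis
```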
|
V. Bojović, B. Lučić (Ruđer Bošković Institute, Bijenička c. 54, Zagreb, Croatia), D. Bešlo (Josip Juraj Strossmayer University of Osijek, Faculty of Agriculture in Osijek, Vladimira Preloga 1, Osijek, Croatia), K. Skala (Ruđer Bošković Institute, Bijenička c. 54, Zagreb, Croatia), N. Trinajstić (Croatian Academy of Sciences and Arts, Zrinski trg 1 and Ruđer Bošković Institute, Bijenička c. 54, Zagreb, Croatia) Calculation of topological molecular descriptors based on degrees of vertices
A topological molecular descriptor is calculated by a mathematical procedure from the structure of a chemical compound represented by a molecular graph, and contains information about its structural characteristics [1]. Many topological descriptors defined in the literature transform specific chemical information into useful numerical values that have been used to correlate structure with various physico-chemical properties [2]. Among the first topological descriptors, which accelerated further development of the field, are the Zagreb indices M1 and M2, introduced by Gutman and Trinajstić in 1972 [3]. Vertices of a molecular graph represent atoms, and edges connecting the vertices represent carbon-carbon (or non-hydrogen) chemical bonds. The total number of edges connecting one vertex with its (first) neighbouring carbon vertices corresponds to the valence of that atom towards other carbons within the molecule. One of the largest classes of topological descriptors is the one based on the analysis of vertex degrees of a molecular graph [4]. Starting from electronic versions of chemical structures, we will develop and optimize an application for the calculation of different vertex-degree-based topological descriptors and analyse their approximate complexity.
[1] R. Todeschini and V. Consonni, Molecular Descriptors for Chemoinformatics, Wiley-VCH, 2009.
[2] N. Trinajstić, Chemical Graph Theory, CRC Press, Boca Raton, 1992.
[3] I. Gutman and N. Trinajstić, Chem. Phys. Lett. 17 (1972) 535.
[4] I. Gutman, Croat. Chem. Acta 86 (2013) 351–361.
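The Zagreb indices defined in the abstract above follow directly from vertex degrees: M1 sums squared degrees over vertices, M2 sums degree products over edges. A minimal sketch with networkx, on an example hydrogen-suppressed skeleton (the molecule choice is arbitrary):

```python
# A minimal sketch: first and second Zagreb indices from vertex degrees.
import networkx as nx

# Hydrogen-suppressed carbon skeleton of 2-methylbutane as an example graph.
G = nx.Graph([(0, 1), (1, 2), (2, 3), (1, 4)])

deg = dict(G.degree())
M1 = sum(d * d for d in deg.values())            # sum of squared degrees
M2 = sum(deg[u] * deg[v] for u, v in G.edges())  # sum over edges
print(M1, M2)                                    # 16, 14 for this graph
```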
|
Basic information:
Chairs:
Karolj Skala (Croatia), Roman Trobec (Slovenia), Davor Davidović (Croatia)
Steering Committee:
Enis Afgan (Croatia), Lene Krøl Andersen (Denmark), Viktor Avbelj (Slovenia), Marian Bubak (Poland), Davor Davidović (Croatia), Matjaž Depolli (Slovenia), Tiziana Ferrari (Netherlands), Montserrat Gonzalez (England), Simeon Gracio (Croatia), Marjan Gusev (Macedonia), Vojko Jazbinšek (Slovenia), Aneta Karaivanova (Bulgaria), Zalika Klemenc-Ketiš (Slovenia), Gregor Kosec (Slovenia), Dieter Kranzlmüller (Germany), Maria Lindén (Sweden), Tomislav Lipić (Croatia), Ludek Matyska (Czech Republic), Željka Mihajlović (Croatia), Jesús Carretero Pérez (Spain), Dana Petcu (Romania), Tonka Poplas Susič (Slovenia), Aleksandra Rashkovska Koceva (Slovenia), Karolj Skala (Croatia), Uroš Stanič (Slovenia), Ivan Tomašić (Sweden), Roman Trobec (Slovenia), Tibor Vámos (Hungary), Matjaž Veselko (Slovenia), Yingwei Wang (Canada), Roman Wyrzykowski (Poland)
Registration / Fees:

REGISTRATION / FEES (prices in EUR)

                                                        EARLY BIRD            REGULAR
                                                        Up to 6 May 2019      From 7 May 2019
Members of MIPRO and IEEE                               200                   230
Students (undergraduate and graduate),
primary and secondary school teachers                   120                   140
Others                                                  220                   250
The discount doesn't apply to PhD students.
Contact:
Karolj Skala
Rudjer Boskovic Institute
Center for Informatics and Computing
Bijenicka 54
HR-10000 Zagreb, Croatia
E-mail: skala@irb.hr
The best papers will get a special award.
Accepted papers will be published in the ISSN registered conference proceedings. Papers presented at the Conference will be submitted for posting to IEEE Xplore.
Authors of outstanding papers will be invited to submit the extended version of their papers to a special issue of Scalable Computing: Practice and Experience (ISSN 1895-1767) published in the first quarter of 2020.
Location:
Opatija, with its 170-year-long tourist tradition, is the leading seaside resort of the Eastern Adriatic and one of the most famous tourist destinations on the Mediterranean. With its aristocratic architecture and style, Opatija has been attracting renowned artists, politicians, kings, scientists and sportsmen, as well as business people, bankers and managers, for more than 170 years.
The tourist offering of Opatija includes a vast number of hotels, excellent restaurants, entertainment venues, art festivals, superb modern and classical music concerts, beaches and swimming pools, and is able to provide the perfect response to all demands.
Opatija, the Queen of the Adriatic, is also one of the most prominent congress cities on the Mediterranean, particularly important for the international MIPRO ICT conventions, held in Opatija since 1979 and gathering more than a thousand participants from more than forty countries. These conventions promote Opatija as the most desirable technological, business, educational and scientific center in Southeast Europe and the European Union in general.
For more details please look at www.opatija.hr/ and www.visitopatija.com.