|
IEEE reserves the right not to publish papers in the IEEE Xplore database if they are not presented at the conference (applies to papers in English only).
Data Science |
Invited Lecture |
T. Kiss (University of Westminster, London, United Kingdom) Scalable Multi-cloud Platform to Support Industry and Scientific Applications
Cloud computing offers resources on-demand and without large capital investments. As such, it is attractive to many industry and scientific application areas that require large computation and storage facilities. Although Infrastructure as a Service (IaaS) clouds provide elasticity and on-demand resource access, multi-cloud capabilities and application-level scalability are still largely unsolved. The CloudSME Simulation Platform (CSSP) extended with the Microservices based Cloud Application level Dynamic Orchestrator (MiCADO) solves such issues. CSSP is a generic multi-cloud access platform for the execution of large-scale industry and scientific simulations on heterogeneous cloud resources. MiCADO provides application-level scalability to optimise execution time and costs. This presentation outlines how these technologies have been developed in various European research projects, and showcases several application case studies from manufacturing, engineering and life sciences where these tools have been successfully utilised to execute large-scale simulations in an optimised way on heterogeneous cloud infrastructures.
|
Papers
|
R. Jugas, M. Vítek, K. Sedlář, H. Škutková (Brno University of Technology, Brno, Czech Republic) Cross-Correlation Based Detection of Contigs Overlaps
Increasing demand for genomic data drives the development of new sequencing techniques and assembly methods. While sequencing techniques are the biologists' domain, genome assembly is a bioinformatics task, and the development of new assembly algorithms responds to the new sequencing methods. The final part of the assembly process is merging the contigs and finding their positions in the genome. Contigs are almost the final product, but they can contain errors and features induced by the previous assembly process. Current methods use string algorithms based on dynamic programming that compute with the characters (A, C, G, T) representing nucleotides, but when applied to long sequences, e.g. contigs, they tend to be time-consuming. We applied another approach, based on genomic signal processing, to evaluate further merging and overlaps between the contigs. The genomic signal form of a DNA sequence can reveal hidden features of sequences, and digital signal processing methods can be applied to it. Also, the computational complexity of the task can be reduced by implementing massive downsampling. We use our own implementation of cross-correlation based on the Pearson correlation coefficient to detect possible overlaps between contigs, where a high positive correlation indicates a possible shared region of the contigs and also denotes the position of that region, without performing an alignment.
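A minimal sketch of the idea described above, assuming the contigs have already been converted to numeric signals; the nucleotide encoding, the minimum overlap and the brute-force shift loop are illustrative choices, not the authors' implementation.

    import numpy as np

    def to_signal(seq):
        # Illustrative numeric encoding of nucleotides (assumed, not from the paper).
        mapping = {'A': 0.0, 'C': 1.0, 'G': 2.0, 'T': 3.0}
        return np.array([mapping[b] for b in seq.upper()])

    def best_overlap(x, y, min_overlap=50):
        # Slide y along x and compute the Pearson correlation of the overlapping
        # parts only; a high positive peak suggests a shared region and its position.
        best_shift, best_r = 0, -1.0
        for shift in range(-(len(y) - min_overlap), len(x) - min_overlap + 1):
            lo_x, lo_y = max(0, shift), max(0, -shift)
            n = min(len(x) - lo_x, len(y) - lo_y)
            if n < min_overlap:
                continue
            r = np.corrcoef(x[lo_x:lo_x + n], y[lo_y:lo_y + n])[0, 1]
            if r > best_r:
                best_shift, best_r = shift, r
        return best_shift, best_r

In practice the signals would first be downsampled, as the abstract notes, which shortens the shift loop considerably.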
|
E. Erlingsson (University of Iceland, Reykjavik, Iceland), G. Cavallaro, A. Galonska, M. Riedel (Juelich Supercomputing Centre, Juelich, Germany), H. Neukirchen (University of Iceland, Reykjavik, Iceland) Modular Supercomputing Design Supporting Machine Learning Applications
The DEEP-EST (DEEP – Extreme Scale Technologies) project designs and creates a Modular Supercomputer Architecture (MSA) in which each module has different characteristics. The design of these modules is driven by scientific applications from different domains; this paper focuses on machine learning in the remote sensing application domain but uses methods such as support vector machines that are also used in life sciences and other scientific fields. One of the challenges in remote sensing is to classify land cover into distinct classes based on hyperspectral datasets obtained from airborne and satellite sensors. One dataset used is the Indian Pines AVIRIS dataset over an agricultural site composed of agricultural fields with regular geometry (200 spectral bands, 1417x617 pixels, spatial resolution of 20 meters, 52 classes of different land cover). Before classification is performed, the raw hyperspectral data is processed with feature engineering techniques, in this case the Self-Dual Attribute Profile (SDAP). The classification is performed with a tuned version of the piSVM MPI code, a parallel Support Vector Machine (SVM) including kernel methods. The full paper outlines how these types of applications benefit from the DEEP-EST technologies. First, the training and testing datasets of the remote sensing application are used many times in the process, so it makes sense to place them in the DEEP-EST Network Attached Memory (NAM) module. Second, training with piSVM in order to generate a model requires powerful CPUs with a fast interconnect for the inherent optimization process and can thus take advantage of the DEEP-EST CLUSTER module (use of the training dataset; requires piSVM kernel and cost parameters). Third, instead of writing the trained SVM model (i.e. the file with support vectors) to disk, it makes sense to place this model in the DEEP-EST NAM module for quick re-use. Then, testing with piSVM in order to evaluate the model accuracy does not require powerful CPUs or a fast interconnect but scales almost perfectly (i.e. it is nicely parallel) and can thus take advantage of the BOOSTER module (use of the testing dataset and the model file residing in the NAM) and more distributed computing. Finally, if the accuracy is too low, the parameters are changed and the process is repeated. Another example is cross-validation. Initial experiments performed with training and testing show that a parameter space search is required in order to perform model selection (i.e. validation). Validation requires a validation dataset, or again the training dataset when using cross-validation (low bias); instead of reading the training data from a file again and again, it can be placed in the DEEP-EST NAM. Afterwards, n-fold cross-validation over a grid of parameters (kernel, cost) estimates the out-of-sample performance and runs n independent training processes on "folded" subsets of the dataset (use of training data in folds). n-fold cross-validation (e.g. 10-fold is often used) with piSVM is partly computationally intensive, but each fold can be nicely parallelized without requiring a fast interconnect and can thus take advantage of the DEEP-EST CLUSTER module (use of training data in folds), while the results of each fold per parameter can be put in the DEEP-EST NAM module. Finally, the best parameters with respect to maximum accuracy across all folds and the whole parameter space can be computed using the DEEP-EST NAM module (FPGA computing the maximum).
To conclude the process, the best parameter set that resides in the DEEP-EST NAM is given as input to the training/test pipeline for real model building and deployment.
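A conceptual sketch of the model-selection loop described above, written with scikit-learn rather than the piSVM MPI code used in the paper; X_train and y_train stand for an assumed feature matrix and label vector, and the grid values are placeholders.

    from sklearn.svm import SVC
    from sklearn.model_selection import GridSearchCV

    param_grid = {'C': [1, 10, 100], 'gamma': [0.01, 0.1, 1.0]}  # illustrative (cost, kernel) grid
    search = GridSearchCV(SVC(kernel='rbf'), param_grid, cv=10)  # 10-fold cross-validation
    search.fit(X_train, y_train)          # each fold is an independent training run
    best_model = search.best_estimator_   # best parameters refit on the full training data

In the DEEP-EST setting the folds would run on the CLUSTER module, with the training data and the per-fold results held in the NAM module instead of being re-read from disk.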
|
F. Hržić, V. Jansky, D. Sušanj (Tehnički fakultet, Rijeka, Croatia), G. Gulan (Medicinski fakultet, Rijeka, Croatia), I. Kožar (Građevinski fakultet, Rijeka, Croatia), Ž. Jeričević (Tehnički fakultet, Rijeka, Croatia) Information Entropy Measures and Clustering Improve Edge Detection in Medical X-Ray Images
Shannon information entropy measures and hierarchical agglomerative clustering were used to detect edges in digital images. The concept is based on communications theory, with the edge detection kernel split into source and destination parts. The arbitrary shape of the kernel parts, and the fact that the information filter output is a real number with a reduced edge-continuity problem, represent the major advantages of this approach.
The methodology was applied globally (the same information entropy parameters were used on the whole image) and locally (adapting the edge detection algorithm to the localized, kernel-sized information context). The results indicate that using the local information context brings out more details but also enhances the noise.
Real-life examples are taken from medical X-ray imaging of a series of femur images in order to illustrate the algorithm's performance on real data.
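The sketch below illustrates the general flavour of an entropy-based edge filter: it compares the Shannon entropies of the two halves of a small window, which is only one plausible reading of the split-kernel idea and not the authors' exact source/destination formulation. An 8-bit grayscale image is assumed.

    import numpy as np

    def shannon_entropy(patch, bins=16):
        hist, _ = np.histogram(patch, bins=bins, range=(0, 256))
        p = hist[hist > 0] / hist.sum()
        return -np.sum(p * np.log2(p))

    def entropy_edge_map(img, k=7):
        # Compare entropies of the left and right halves of a k x k window;
        # a large difference marks a candidate (vertical) edge.
        half = k // 2
        out = np.zeros(img.shape, dtype=float)
        for i in range(half, img.shape[0] - half):
            for j in range(half, img.shape[1] - half):
                win = img[i - half:i + half + 1, j - half:j + half + 1]
                out[i, j] = abs(shannon_entropy(win[:, :half]) -
                                shannon_entropy(win[:, half + 1:]))
        return out

Because the output is a real-valued map rather than a binary decision, thresholding or clustering can be applied afterwards, as the abstract suggests.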
|
L. Bento (Instituto de Telecomunicações, Leiria, Portugal), L. Távora (Polytechnic Institute of Leiria, Leiria, Portugal), P. Assunção, S. Faria, R. Fonseca-Pinto (Instituto de Telecomunicações, Leiria, Portugal) Using Local Binary Patterns in Speckle Image Analysis
First described by Newton in the 17th century, speckle is an optical phenomenon which can be translated into image patterns produced by wave interference of diffuse reflections. In fact, the speckle pattern is generated by the multiple interference phenomena that occur when a rough surface is illuminated with a coherent source of light, producing randomly distributed reflected waves of the same frequency but different phases and amplitudes. Although it has been known for a long time, capturing video sequences of speckle patterns depended on recent technological developments, in particular those related to laser technology and microsensors.
The speckle acquisition setup comprises a light source, usually a laser, an optical beam expander and a CCD camera. The generated interference patterns are captured in a series of video sequences for further processing.
In previous works, several image processing algorithms have been applied to analyze video frames of speckle, aiming to capture the evolution patterns in dynamic processes. However, due to the typically high frequencies of the changing patterns, classical texture algorithms mostly fail to achieve this goal.
In this work, speckle dynamics are evaluated using Local Binary Patterns (LBP) jointly with some of their main variants and a newly proposed algorithm, in a reactive hyperemia controlled test. The proposed methodology goes beyond the traditional implementations of LBP by considering an additional Gaussian filtering, a methodology thus coined LBPg. The results, on one hand, confirm that the classical formulations of LBP are not sensitive to changes in the simulated patterns but, on the other hand, demonstrate that the newly proposed LBP-adapted algorithms successfully identify the dynamics of the processes under study.
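A minimal sketch of the LBPg idea as it is described above: a Gaussian pre-filter followed by a standard local binary pattern, shown here with SciPy and scikit-image; the Gaussian sigma and the LBP parameters are illustrative assumptions, not values from the paper.

    from scipy.ndimage import gaussian_filter
    from skimage.feature import local_binary_pattern

    def lbpg(frame, sigma=1.5, n_points=8, radius=1):
        # Gaussian smoothing before the LBP operator (the "g" in LBPg).
        smoothed = gaussian_filter(frame.astype(float), sigma=sigma)
        return local_binary_pattern(smoothed, n_points, radius, method='uniform')

Per-frame histograms of the resulting codes can then be tracked over the video sequence to follow the speckle dynamics.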
|
J. Slak, G. Kosec ("Jožef Stefan" Institute, Parallel and distributed systems laboratory, Ljubljana, Slovenia) Parallel Coordinate Free Implementation of Local Meshless Method
This paper presents an implementation of a Meshless Local Strong Form Method that allows users to write elegant code expressed in terms of more abstract mathematical objects, such as operators and fields, avoiding working directly with matrix and array indices, which is tedious and error-prone. This is achieved by using object-oriented techniques for the definition of abstract concepts and by leveraging C++'s powerful templating mechanism to allow for type-agnostic interdependence. It is demonstrated that code written this way has little to no performance overhead compared to classical numerical codes while offering more expressive power and readability. It can also be efficiently parallelized over spatial dimensions. The overall functionality is illustrated on model examples from classical thermodynamics, linear elasticity and fluid dynamics in one, two and three dimensions.
|
L. Krøl Andersen (Kgs. Lyngby, Denmark), B. Vedel Thage (Danish eInfrastructure Cooperation (DeiC), Kgs. Lyngby, Denmark), A. Syed (Technical University of Denmark; National Life Science Supercomputing Center, Kgs. Lyngby, Denmark), B. Andersen (The Royal Danish Library, Cultural Heritage Cluster, Aarhus, Denmark), P. Løngreen (Technical University of Denmark; National Life Science Supercomputing Center, Kgs. Lyngby, Denmark), K. Gammelgaard Nielsen (University of Southern Denmark, Dept. of IT-service, Odense, Denmark), S. Pedersen (Danish eInfrastructure Cooperation (DeiC), Kgs. Lyngby, Denmark) National Supercomputing in Denmark
National supercomputing was introduced in Denmark in 2014. Three national supercomputers were jointly funded by the Danish Government, the Ministry of Higher Education & Science, the larger universities in Denmark and the Royal Danish Library. The national compute facilities were to fill the gap in computing resources between local university resources and Nordic, European and international ones. More importantly, however, the goal was to give all researchers in Denmark equal access to computing resources in order to meet and qualify for the foreseeable future of Big Data. Three years have passed and the number of national HPC users is steadily increasing every day. Workflows across university borders have been established, national payment models are in place, and eScience expert hubs and centers are forming, adjusting to local needs and available resources. All in all, eScience is forming its landscape in Denmark. This paper illustrates the journey Denmark has been through in establishing and integrating national supercomputing into its research culture. Previously, such significant compute resources were strong competitive research parameters. They are now national collaborative facilities, opening up to newcomers to HPC, increasing interdisciplinary research, transferring HPC expertise between scientific disciplines, etc. Furthermore, this paper addresses the challenges and successes in reaching this cornerstone for Denmark and identifies the impact of national supercomputing on science in Denmark.
|
Break |
|
J. Opiła, G. Opiła (AGH University of Science and Technology, Kraków, Poland) Visualization of Computable Scalar 3D Field Using Cubic Interpolation or Kernel Density Estimation Function
Visualization of data, both computed and empirical, is an important part of the knowledge acquisition process. Although computed data, e.g. molecular fields, is usually well structured for advanced visualization, including volume rendering, efficient data presentation, including but not limited to visualization by means of isosurfaces, often requires enormous computational power. In order to reduce the required computational power, various algorithms have been developed. In the paper, three of them, namely tri-linear interpolation, tri-cubic interpolation and approximation employing a kernel density estimation function, have been tested and compared against the direct isosurface solution with respect to computation time, accuracy and visual appearance. All examples were computed using hybrid data visualization styles with advanced texturing as part of data presentation. For fast prototyping of visual styles and computation of the visual examples, POVRay complemented with the newest version of the ScPovPlot3D toolkit has been used.
|
J. Opiła, T. Pełech-Pilichowski (AGH University of Science and Technology, Kraków, Poland) Visualization of Irregular Datasets Using Kernel Density Estimation Function
Visualization of empirical data is an important part of the knowledge acquisition process. Numerous visualization techniques are thus employed, including surface and volume rendering. Usually, visualization algorithms require data to be organized in a specific manner. Unfortunately, data accumulated empirically is often amorphous, i.e. it does not exhibit any internal regularity, e.g. due to the varying spatial density of samples resulting from natural constraints or properties of the apparatus. In order to cope with this problem, several preprocessing procedures have been developed, including distance-like methods very well suited for parallel computation. The paper thus discusses the significance of and problems connected with data preprocessing for robust data analysis, and hybrid data visualization styles using advanced texturing as part of data presentation for time series prediction, event detection and other data analysis purposes. For fast prototyping of visual styles and computation of the visual examples, POVRay complemented with the newest version of the ScPovPlot3D toolkit has been used.
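A minimal sketch of smoothing scattered, irregularly spaced samples with a kernel density estimate before rendering; SciPy is used purely for illustration (the paper's toolchain is POVRay with the ScPovPlot3D toolkit), and the sample data and grid resolution are assumptions.

    import numpy as np
    from scipy.stats import gaussian_kde

    points = np.random.rand(3, 500)              # hypothetical scattered 3-D samples
    kde = gaussian_kde(points)                   # bandwidth chosen by Scott's rule
    grid = np.mgrid[0:1:32j, 0:1:32j, 0:1:32j]   # regular grid for visualization
    density = kde(grid.reshape(3, -1)).reshape(32, 32, 32)

The regular density field can then be passed to an isosurface or volume renderer, which is exactly the step the amorphous input data did not allow directly.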
|
B. Dušić, D. Beževan, D. Pinčić, Ž. Jeričević (Tehnički fakultet, Rijeka, Croatia) Automatic Focusing of Optical Microscope Using Off-the-Shelf Hardware Components and In House Software
The optical microscope is one of the most important instruments in medical diagnostics. It is also one which requires high mental concentration and causes eye strain over extended periods of time. To turn a microscope into a real high-throughput instrument, as much routine work as possible has to be shifted to the computer. Some commercial systems exist, but at a premium price. The system developed here is a proof of concept that the required performance can be achieved with a cost reduction of two orders of magnitude.
Two computer systems were used: a desktop PC for data collection, analysis and indirect hardware control, communicating with a Raspberry Pi computer which controlled the stepper motors.
Our first effort was directed toward focusing of the microscope (Z-axis control); in-plane control (X and Y axes) is conceptually much simpler. The focal point was found using the Haar transform with inter- and intra-application of least squares.
Real-life examples were taken from medical imaging of stained human cells as they are used in medical pathology.
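A hedged sketch of one way such an autofocus step could look: the detail (high-frequency) energy of a 2-D Haar transform serves as a focus measure for each Z position, and a least-squares parabola locates the best focus. The PyWavelets call and the parabola fit are illustrative; the paper's exact inter/intra least-squares scheme may differ.

    import numpy as np
    import pywt

    def haar_focus_measure(img):
        # Energy of the Haar detail sub-bands; sharper images score higher.
        _, (cH, cV, cD) = pywt.dwt2(img.astype(float), 'haar')
        return float(np.sum(cH**2) + np.sum(cV**2) + np.sum(cD**2))

    def best_focus(z_positions, frames):
        scores = [haar_focus_measure(f) for f in frames]
        a, b, c = np.polyfit(z_positions, scores, 2)  # least-squares parabola
        return -b / (2 * a)                           # vertex = estimated focus (assumes a < 0)

In a scanning loop, the Raspberry Pi would move the Z-axis stepper to each candidate position while the PC captures a frame and evaluates the measure.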
|
N. Tomikj, M. Gusev (University Sts Cyril and Methodius, Skopje, Macedonia) Parallel Matrix Multiplication
Utilizing all CPU cores available for numerical computations is a topic of considerable interest in HPC. This paper analyzes and compares four different parallel algorithms for matrix multiplication without block partitioning using OpenMP: Row First (RF) parallelization, Column First (CF) parallelization, Row by Row (RR) multiplication, and Matrix Transposing first (MT) approach.
The comparison of the algorithms is based on the achieved speed, calculated as operational bandwidth, and on the algorithms' efficient use of the cache.
Although the MT algorithm is the best and the most reasonable choice, the other algorithms can be considered depending on the programming language used, and the size of the matrix.
The goal was to analyze and compare the algorithms in detail, with an explanation of the impact of the architecture and the specification of the number of threads.
A careful programmer may choose which algorithm to use depending on the situation, how many threads should be used, what the benefits of the threads are, and what negative consequences arise from creating more threads than necessary.
A larger number of threads does not necessarily mean more performance. This paper explains how many threads to use depending on the processor used for executing the algorithms.
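A conceptual sketch of the "matrix transposing first" (MT) idea: transposing B once turns the inner product into two sequential row scans, which is cache friendly. Plain Python loops are used only to expose the access pattern; the paper's implementations use OpenMP threads, which this sketch does not reproduce.

    import numpy as np

    def matmul_mt(A, B):
        n, k = A.shape
        k2, m = B.shape
        assert k == k2
        BT = B.T.copy()                 # one-off transpose of B
        C = np.zeros((n, m))
        for i in range(n):              # rows of A; each row is independent work for a thread
            for j in range(m):
                C[i, j] = np.dot(A[i, :], BT[j, :])   # two contiguous row scans
        return C

The row-first (RF) and column-first (CF) variants differ only in which loop is parallelized and, consequently, in how the threads touch memory.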
|
F. Došilović (Student at University of Zagreb Faculty of Engineering and Computing, Zagreb, Croatia), M. Brčić, N. Hlupić (University of Zagreb Faculty of Engineering and Computing, Zagreb, Croatia) Explainable Artificial Intelligence: A Survey
In the last decade, with the availability of large datasets and more computing power, machine learning systems have achieved (super)human performance in a wide variety of tasks. Examples of this rapid development can be seen in image recognition, speech analysis, strategic game planning and many more. The problem with many state-of-the-art models is a lack of transparency and interpretability. This lack is a major drawback in many applications, e.g. healthcare and finance, where the rationale for a model's decision is a requirement for trust. In the light of these issues, explainable artificial intelligence (XAI) has become an area of interest in the research community. This paper summarizes recent developments in XAI in supervised learning, starts a discussion on its connection with artificial general intelligence, and gives proposals for further research directions.
|
Data Science |
Papers |
K. Sahatqija, J. Ajdari, X. Zenuni, B. Raufi, F. Ismaili (SEEU, Tetovo, Macedonia) Comparison between Relational and NOSQL Databases
The relational database is the traditional DBMS, which ensures the integrity of data and the consistency of transactions. For many software applications, these are the principles of a proper DBMS. In the last few years, however, given the velocity of data growth and the lack of support in traditional databases for this issue, NoSQL (Not Only SQL) databases have appeared as a solution. While these two kinds are used for the same purposes (to create, retrieve, update and manage data), each has its own advantages and disadvantages over the other. Hence, this study tackles the research question of what the pros and cons are and what each database's features and characteristics are. This paper is qualitative research, based on a detailed and intensive analysis of the two database types, through the use and comparison of papers and materials published by many authors during the last few years.
|
V. Bogdanova, S. Gorsky (Matrosov Institute for System Dynamics and Control Theory of Siberian Branch of RAS, Irkutsk, Russian Federation) Scalable Parallel Solver of Boolean Satisfiability Problems
One of the current trends in high-performance computing (HPC) is applying its possibilities to solve the Boolean satisfiability problem (SAT). SAT is a fundamental problem of mathematical logic and computational theory. Many of the most important Data and Life Sciences problems can be formulated as SAT, in particular the Regulation in Animals and Plants problem in bioinformatics. Traditionally, two approaches to the parallel solution of SAT are used, competitive and cooperative. We propose a new massively parallel SAT solver, Hpcsat, implemented using MPI technology and based on the second approach. We describe the architecture and functionality of the solver and the toolkit for automation of the computational experiments process. Results of computational experiments confirming the scalability of Hpcsat are presented. The results confirmed the advantage of Hpcsat in comparison with the existing analogous massively parallel solver HordeSat.
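A toy illustration of the first step mentioned above, posing a problem as SAT in conjunctive normal form and handing it to a solver; the sequential pycosat library is used here only to show the encoding and does not reproduce the massively parallel MPI design of Hpcsat.

    import pycosat

    # (x1 OR NOT x2) AND (x2 OR x3), encoded as lists of signed variable indices.
    cnf = [[1, -2], [2, 3]]
    assignment = pycosat.solve(cnf)   # a satisfying assignment, or the string "UNSAT"

A cooperative parallel solver splits work on the same formula among processes that exchange learned clauses, whereas a competitive one races differently configured solvers on the whole formula.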
|
P. Brezany (University of Vienna, Faculty of Computer Science, Vienna, Austria), M. Janatova (Charles University, Prague, Czech Republic), O. Stepankova, M. Uller (FEL CVUT, Department of Cybernetics, Prague, Czech Republic), M. Lenart (University of Vienna, Faculty of Computer Science, Vienna, Austria) Towards Precision Brain Disorder Rehabilitation
Brain disorders include any conditions or disabilities that affect the brain. Some frequently occurring brain disorders (caused by, e.g., injury or stroke) are associated with balance disorders that can to some extent be compensated by highly personalized and long-term rehabilitation. This paper presents the basic principles and reports the first results of our project focused on the design and development of a new-generation, data-centric approach to the treatment of balance disorders through exergames offering many levels of difficulty that can be adjusted through the choice of their parameters. The suggested approach paves the way to future precision rehabilitation models, in which a significant part of the training can be ensured through game-like exercise supervised by an intelligent adaptive system called the Intelligent Therapy Assistant (ITA), providing the patient with biofeedback. Further, the supervising ITA iteratively modifies (adapts) the parameters of the next run of the game according to the current needs of the individual patient in order to keep the patient motivated; this is ensured by introducing a feedback loop between the ITA system and a thorough evaluation of the results achieved by the patient during the therapy process.
|
A. Feoktistov, R. Kostromin, I. Sidorov, S. Gorsky (Matrosov Institute for System Dynamics and Control Theory of SB RAS, Irkutsk, Russian Federation) Development of Distributed Subject-Oriented Applications for Cloud Computing through the Integration of Conceptual and Modular Programming
The paper addresses the relevant problem of developing distributed subject-oriented applications to solve large-scale scientific and applied problems in a public access computer center. The center includes a dedicated resource for cloud computing (a private cloud) and non-dedicated resources for traditional distributed computing. Applications generate flows of computational jobs that determine the requirements (problem-solving time, size of memory and disks, number of nodes, processors and cores, as well as used program libraries, compilers, their keys, images of virtual machines, etc.) for a computational system that are necessary for solving problems. In general, a job includes a set of interrelated subjobs. We propose a new approach to the development of applications. It is based on the integration of conceptual and modular programming. This approach includes the following main stages: structural analysis of a subject domain, creating its specification, developing the applied software, configuring the system software, and executing jobs based on multi-agent management of computational processes. We have also created two toolkits to support the proposed approach and develop applications. These toolkits differ in their functionality and in their features for describing processes of distributed computing. The application software is based on the use of knowledge embedded in a conceptual model of the environment. It describes the parameters of subject domains, operations over parameter fields, software modules (applied software) that implement operations, computational nodes, communication channels and other objects of both the subject domains and the environment, including the relationships between objects. In comparison to the known tools used for the development and execution of distributed applications in current practice, the created toolkits provide execution of application jobs in an integrated environment of virtual machines that includes both the dedicated and non-dedicated resources of the center. Moreover, users of the applications developed with the help of our toolkits can formulate problems in various forms (procedural or non-procedural, complete or incomplete) with uncertainty in parameters, operations, modules or resources that can be used to solve the problems. The toolkits provide traditional languages for the graphical and textual specification of objects of the conceptual model, and specialized languages for monitoring and processing the special events occurring in cloud computing. The system software of the applications automates the creation of problem-solving schemes (workflows) for all forms of problem formulations and the generation of jobs for executing these schemes. Experiments on solving large-scale practical problems of optimization of multiextremal functions and warehouse logistics show the advantages of the developed applications and the ability of our toolkits to adjust to the specifics of subject domains.
|
M. Kozlovszky, P. Bogdanov, K. Karóczkai, Z. Garaguly (Obuda University, Budapest, Hungary), G. Kronreif (ACMIT GmbH, Wiener Neustadt, Austria) IMU Based Human Movement Tracking
The paper describes an inertial measurement unit based movement tracking hardware device, which can be used for human movement tracking. Tracking the movement of the patient is a difficult but very important task during movement rehabilitation.
Our software and hardware environment employs a multi-sensor fusion based solution, where the data are taken from multi-axis accelerometer and gyroscope sensors. The solution is under testing in various environments, and some evaluation results of its usage are detailed in our paper.
|
Biomedical Engineering |
Papers
S. Ostojić, V. Srhoj-Egekher (Faculty of Electrical Engineering and Computing, Zagreb, Croatia), S. Peharec (Polyclinic Peharec, Pula, Croatia), M. Cifrek (Faculty of Electrical Engineering and Computing, Zagreb, Croatia) A Non-arbitrary Definition of the Analyzing Interval of sEMG signal Measured during Sustained Low Back Extension
Recordings of surface EMG signals above the lower back muscles during sustained exercise protocols, used for classification of different categories of low back pain subjects and healthy subjects, are typically accompanied by measurements of kinematic parameters, elaborating the measurements per se and the interrelationship of surface EMG and kinematic parameters. We propose an exercise protocol without measurement of kinematic parameters and examine the possibility of using spectral changes of the surface EMG signals during fatiguing static contractions to define the time interval to be analyzed. The protocol was applied to fifteen subjects without low back pain who were able to sustain long static contractions of over 100 s. The variability of the slope of the regression line that fits the values of the median frequency of the power spectral density function in a least-squares sense is checked against the duration of the analyzed interval and the selected start time. This study proposes that the muscle contraction start time can be related to a distinct spectral change of the surface EMG signal that is a consequence of the designed test protocol.
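A minimal sketch of the spectral quantity discussed above: the median frequency of the power spectral density is computed per window and a least-squares line is fitted to its time course; the sampling frequency, window length and Welch settings are assumptions, not values from the paper.

    import numpy as np
    from scipy.signal import welch

    def median_frequency(segment, fs):
        f, pxx = welch(segment, fs=fs)
        cumulative = np.cumsum(pxx)
        return f[np.searchsorted(cumulative, cumulative[-1] / 2)]

    def mdf_slope(emg, fs=1000.0, win_s=1.0):
        n = int(win_s * fs)
        mdf = [median_frequency(emg[i:i + n], fs) for i in range(0, len(emg) - n, n)]
        t = np.arange(len(mdf)) * win_s
        slope, intercept = np.polyfit(t, mdf, 1)   # least-squares regression line
        return slope

During fatiguing static contractions the median frequency typically drifts downwards, so the slope and its variability depend on where the analyzed interval starts, which is exactly what the study examines.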
|
Z. Djinovic (ACMIT GmbH, Wiener Neustadt, Austria), M. Tomic (ETF, University of Belgrade, Belgrade, Serbia), R. Pavelka (ENT specialist, Wiener Neustadt, Austria), G. Sprinzl (University Hospital St. Pölten, St. Pölten, Austria) A Comparative Study on Miniature Retroreflectors for Totally Implantable Hearing Aids Using an Optical Detection of the Incus Vibration
Optical interferometric detection of middle ear ossicle vibration is used to obtain the acoustic signal needed for a totally implantable hearing aid (TIHA). In order to improve the reflectivity, we attached a lightweight piece of retroreflector to the incus. Three different kinds of retroreflectors were investigated here, using a modified fiber-optic Michelson interferometer based on a single-mode 3x3 coupler and a high-coherence VCSEL light source. The sinusoidal vibration of the target was produced by a PZT, in the acoustic frequency range from 100 Hz to 8 kHz, with amplitudes in the range of several picometers to hundreds of nanometers. The quadrature signals were captured and demodulated while the position and incident angle of the probing beam on the retroreflector were changed. The evaluation of the retroreflectors and probe beam collimating techniques was made with respect to the maximum signal-to-noise ratio and the minimum detectable amplitude of vibrations.
|
Break |
|
A. Rashkovska Koceva, V. Avbelj (Jožef Stefan Institute, Ljubljana, Slovenia) Three-year Experience with a Wireless ECG Sensor
The success of performing long-term ECG measurements depends on how much the device interferes with the user. A small and unobtrusive sensor on the skin in conjunction with a smartphone presents a viable option today, because such a configuration also allows online access to cardiac services. Despite the limited choice of electrode placement, it has been shown that the ECG sensor may replace the ambulatory ECG (Holter ECG) in many cases. In this paper, we present our experience on setting up an ECG sensor to obtain the best information about heart arrhythmias. We also report our experience with unconventional use of the sensor in humans and animals, and give recommendations for the future use of such devices.
|
B. Širaiy (Jožef Stefan Institute, Ljubljana, Slovenia), U. Stanič (School of Health, Ljubljana, Slovenia), A. Poplas-Sušič (Community Health Centre Ljubljana, Ljubljana, Slovenia), Z. Katkič (School of Health, Ljubljana, Slovenia) Impact Assessment of the Morning Gymnastics “1000 movements” via ECG and Sport Test
In Slovenia, every morning around 3000 elderly subjects (average age 68 years) take part in half an hour of gymnastics entitled “1000 movements” in the open air. It is organized by a volunteer society named »Šola Zdravja« (hereinafter in English “School of Health” (SOH)). The society is financially supported by the Ministry of Health due to the satisfaction of the gymnastics participants, the high yearly growth rate of the membership and the expansion to new local communities. Namely, a healthier elderly population significantly lowers the costs to the health system. At the same time, these positive developments generated the need to estimate the benefits of the morning exercise objectively. The aim of this study is to develop such an evaluation methodology using questionnaires, measurement of ECG by wearable body sensors and fitness tests for elderly people. The interdisciplinary project consortium consists of a primary medical institution, a scientific research institute and a gymnastics society.
|
B. Širaiy (Institut Jožef Stefan, Ljubljana, Slovenia), V. Ilić (Faculty of Sport and Physical Education, University of Belgrade, Belgrade, Serbia), R. Trobec (Institut Jožef Stefan, Ljubljana, Slovenia) Evaluating Telemetric ECG Body Sensor Signal in Exercise Stress Test: Pilot Study
The aim of this pilot study is to determine the maximal heart rate that can be assessed with telemetric ECG body sensors during exercise stress tests. Twenty subjects were randomly divided into two experimental groups. The first group of ten subjects was tested on a treadmill and the second group of ten subjects on a cycle ergometer. Two different types of electrodes and two different sensor positions were used on each subject through the concurrent use of two ECG sensors. Every subject performed two exercise tests on the same exercise device: in the first test both ECG sensors were attached to self-adhesive skin electrodes on the body, while in the second test, with the same electrodes, the sensors were additionally fixed with self-adhesive tapes. The obtained measurements were compared on both exercise devices, i.e., treadmill and cycle ergometer, regarding the type of electrodes, the ECG sensors' positions, and the kind of sensor fixation. The results show a significant difference between non-fixed and fixed electrodes at sensor position LI on the treadmill (p=0.05), and between non-fixed positions LI and LS (p=0.001), and fixed positions LI and LS (p=0.024) on the cycle ergometer. The heart rate was detected best on the cycle ergometer with the ECG sensor in position LI, either with non-fixed or fixed electrodes (p=1.000).
|
M. Mohorčič, M. Depolli (Jožef Stefan Institute, Ljubljana, Slovenia) Practical Usage of IJS VisEcg Framework for Processing ECG Data
Processing of medical data has long been reserved for researchers in the field.
There have been numerous attempts at creating unified tools for reading and processing such data; however, existing solutions are either cumbersome to use or proprietary with poor documentation.
A team at the Jožef Stefan Institute has produced a new framework for interaction with ECG domain problems.
The VisEcg framework aims to simplify the process of gathering and processing ECG medical data, and thereby ease the learning curve for new researchers in the field.
Its design enables a standardized access to multiple file formats and interfaces currently used in the field of ECG research.
We developed simple applications using the framework as a core interface to access, process and write data.
In the article we present our implementation and evaluate the usability and functionality of the framework and the produced applications.
|
D. Tomić (Sveučilište u Rijeci, Rijeka, Croatia) Calculating the Effectiveness of Some Important Anticancer Herbal Compounds against the Main Hallmarks of Cancer
There is mounting evidence from numerous in vitro and in vivo experiments that certain herbal compounds express strong activity against various types of cancer cells. Curcumin from Curcuma longa, resveratrol from Vitis vinifera, and artemisinin from Artemisia annua are a few examples from a much longer list of more than fifty anticancer herbs known today. It is mostly due to the extreme complexity of cancer processes that the exact mechanism of the anticancer activity of these compounds is still unknown. However, with the advent of powerful supercomputers and the development of sophisticated in silico models of cancer, our chances of understanding these processes have increased enormously, and we are now in a position to deliver more effective anticancer therapies than ever before. The Kyoto Encyclopedia of Genes and Genomes (KEGG) is an example of such an in silico model, encompassing, among many biological processes, cancer as well. From the KEGG model of cancer, we were able to identify sixty-two oncogene proteins involved in the development, proliferation, angiogenesis, metastasis and resistance of cancer. From these, we selected those having the largest number of hits in the PubMed database related to each of these cancer hallmarks, and used a supercomputer to run docking simulations between them and selected anticancer herbal compounds. Significant docking activity of the anticancer herbal compounds against the oncogene proteins was found. We hope our research will bring a better understanding of how these compounds work against cancer and establish pathways towards more effective therapies in the future.
|
O. Bilalovic, Z. Avdagic (Faculty of Electrical Engineering, Sarajevo, Bosnia and Herzegovina) Robust Breast Cancer Classification Based on GA Optimized ANN and ANFIS-Voting Structures
With rising cancer rates in the world, it is important to use all possible means to prevent, detect, and cure this disease. Breast cancer presents one of those threats, and the bioinformatics field must work towards finding models to fight it, one of them being the creation of a classification model for this kind of illness.
Using machine learning techniques to make these classifications is one of those means. It is widely known that ANN and ANFIS can significantly upgrade any kind of classification process and, in that way, help in biomedicine and cancer treatment. Furthermore, more objective models for classification must be developed, regarding both time and resources, in order to obtain optimal results.
In this paper, a genetic algorithm (GA) that optimizes ANN and ANFIS has been used for the classification of breast cancer diagnoses. It is shown that GA optimization of ANFIS and ANN parameters results in a model with higher accuracy compared to the basic classifiers. A voting method has been used on the GA-optimized ANFIS structure in order to achieve greater model reliability. The final score of the computed models was determined using external validation, based on the four most relevant clinical metrics: sensitivity, specificity, accuracy and precision.
|
Biomedical Engineering |
Invited Lecture |
I. Tomašić (School of Innovation, Design and Engineering, Västerås, Sweden)
Continuous Remote Monitoring of Chronic Obstructive Pulmonary Disease (COPD) Patients |
Papers |
A. Vilhar, M. Depolli (Jožef Stefan Institute, Ljubljana, Slovenia) Improving the Dynamics of Off-line Time Synchronization in Wireless Sensor Networks
Time synchronization in wireless sensor networks (WSN) is a well-known issue. One of the approaches to the problem is to use off-line algorithms, meaning that no exchange of signaling messages for the synchronization of individual clocks takes place. Such an approach is typically based on a linear regression, which serves as an estimate of the actual ratio of the clock rates in the network. Due to the ever-changing conditions in the network, where the wireless channel reliability and clock rates vary over time, the method needs to adapt dynamically. In the process of adaptation, discontinuities of the determined linear regression lines may appear, which causes sudden time offsets. In this article, new procedures are proposed to cope with the varying conditions in such a way as to achieve transitions that are as seamless as possible. It is shown that unless such procedures are used, time offsets of up to half a second may appear as peaks observed in the transition from one state to another. The proposed method reduces the offsets by an order of magnitude. The effectiveness of the proposed solution is demonstrated on a specific case of wireless sensor ECG measurements.
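A hedged sketch of the underlying off-line idea: pairs of (local, reference) timestamps are fitted with a least-squares line whose slope estimates the clock-rate ratio and whose intercept estimates the offset. The paper's actual contribution, smoothing the transition between successive re-fits, is not shown here.

    import numpy as np

    def fit_clock(local_ts, reference_ts):
        # Least-squares line: reference ~ slope * local + intercept.
        slope, intercept = np.polyfit(local_ts, reference_ts, 1)
        return slope, intercept

    def to_reference_time(t_local, slope, intercept):
        return slope * t_local + intercept

When the line is re-fitted on a new window of timestamp pairs, the new (slope, intercept) generally differs slightly from the old one, which is the source of the offset peaks the paper aims to remove.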
|
L. Bento (Instituto de Telecomunicações, Leiria, Portugal), L. Távora (Polytechnic Institute of Leiria, Leiria, Portugal), P. Assunção, S. Faria, R. Fonseca-Pinto (Instituto de Telecomunicações, Leiria, Portugal) Evaluation of Cutaneous Microcirculation Patterns by Laser Speckle Imaging
Atherosclerosis is a chronic systemic process affecting distal circulation, a condition known as Peripheral Artery Disease (PAD). Epidemiological and clinical studies indicate a strong association between PAD and death due to cardiovascular diseases; thus the early detection of PAD by measuring perfusion levels of distal body regions is seen as a marker of vascular integrity.
This work proposes a new methodology for evaluation of local cutaneous perfusion through laser speckle video processing. The speckle pattern generated by a laser beam projected onto the skin surface after being expanded by an optical setup is captured by a CCD video camera. To evaluate the particular characteristics of the speckle pattern, a video processing algorithm based on an adaptive Local Binary Pattern methodology, highlighted by a local Gaussian filtering scheme, the LBPg, was developed.
In order to test this new methodology of video speckle analysis, different patterns of microcirculation were evaluated in skin regions with different textures.
The experimental results were compared with the ones obtained in two clinical conditions associated with PAD (i.e., Deep Vein Thrombosis and Diabetic foot). The results show that the proposed approach is sensitive to the change in perfusion levels (even in cases with reduced perfusion variations), thus indicating that the use of laser speckle technology, jointly with LBPg, is a promising noninvasive, low cost and sensitive method for the early detection of PAD-related diseases.
|
V. Avbelj (Department of Communication Systems, Jožef Stefan Institute, Ljubljana, Slovenia), M. Brložnik (Small Animal Clinic, Veterinary Faculty, University of Ljubljana, Ljubljana, Slovenia) Heartbeat Interval Dynamics in Response to Acute Stress in Human: A Case Study of Real Fear of Snake
Heartbeat dynamics change promptly and substantially in response to acute stress. This is due to the direct neural connection between the heart and the central nervous system. An opportunity to analyze a video recording synchronized with a simultaneously recorded ECG arose after the preparation of an educational TV show about adrenaline. Although the scenario for the experiment in Ljubljana Zoo did not foresee exposing a person to acute fear, the unexpected awareness of the proximity of the snake's head led to actual sudden acute fear of the snake, for the third time in the person's life. This case study presents the heartbeat dynamics before and during the acute fear phase. The response of the heart to acute fear was immediate. Large and fast variations of heartbeat intervals were observed throughout the whole experiment (heart rate decreases of up to 32 beats/min in 5 seconds), indicating an upheaval in the regulation of cardiovascular parameters.
|
Ž. Kokelj, C. Bohak, M. Marolt (University of Ljubljana, Ljubljana, Slovenia) A Web-based Virtual Reality Environment for Medical Visualization
In this paper, we present a novel approach to integrating virtual reality (VR) into a web-based medical visualization framework. The framework supports visualization of volumetric data, such as 3D scalar fields acquired by CT, MRI or PET scanners. To improve users' perception, understanding and manipulation of 3D volumes, we adapted the traditional 2D screen representation with support for visualization of data in a VR environment. By providing complete visual immersion, VR can help users gain better insight and understanding of the visualized data. Our main goal was to allow users to view the medical data in VR and interact with it using hand-held controllers for better immersion and spatial perception. In the paper, we present a novel approach to the implementation of VR for medical imaging, which combines WebGL-based hardware-accelerated web visualization with VR. This allows users to use the visualization framework with or without a VR headset by switching between "standard" and "VR" modes. Since the visualization runs in a web browser, it is portable, easy to use on different devices and therefore accessible to a broad number of users. The visualization system was tested with real medical scans to assess its performance and usability.
|
Break |
|
M. Gusev, E. Domazet (University Sts Cyril and Methodius, Skopje, Macedonia) Optimal DSP Bandpass Filtering for QRS Detection
An electrocardiogram (ECG) is a recording of the electrical activity of the heart over a certain time interval. The ECG signal holds vital information about the current health condition of the patient. Detection of cardiac disorders is based on the detection of sudden deviations from the mean line. Detection of the heartbeat function is based on extracting characteristic ECG features, especially the R-peak.
Although we address a general approach in this paper, we focus on using wearable ECG sensors and developing an efficient QRS detector to determine the heartbeat function. The real problem in detection and ECG signal analysis is processing the noise-contaminated ECG signal and reducing the feature space to extract the relevant features.
In this paper, we set a research question to investigate how the filter affects the accuracy, sensitivity and precision values of QRS detectors.
We report our findings on an optimal filter design with a central frequency of 8.33 Hz and -3 dB cutoff frequencies at 4 Hz and 20 Hz. The analysis is directed towards the construction of an efficient filter with low computational complexity intended for wearable ECG sensors.
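A minimal sketch of a band-pass of the kind reported above (roughly 4-20 Hz) applied to an ECG trace before R-peak detection; the filter family, filter order and sampling frequency are assumed values for illustration, not the paper's optimized low-complexity design.

    from scipy.signal import butter, filtfilt

    def bandpass_ecg(signal, fs=250.0, low=4.0, high=20.0, order=2):
        # Band edges normalized to the Nyquist frequency fs/2.
        b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype='band')
        return filtfilt(b, a, signal)    # zero-phase filtering of the ECG trace

Restricting the signal to this band suppresses baseline wander and much of the muscle noise while preserving the QRS energy, which is what makes the subsequent R-peak detection cheap and robust.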
|
M. Makovec (Novo mesto General Hospital, Novo mesto, Slovenia), U. Aljančič, D. Vrtačnik, B. Pečar (Fakulteta za elektrotehniko, Univerza v Ljubljani, Ljubljana, Slovenia) Evaluation of Peripheral Arterial Occlusive Disease with Arterial Oscillograph Prototype
BACKGROUND:
Arterial oscillography is a non-invasive technique using a pneumatic sensor for the measurement of blood volume changes inside an artery. The design and fabrication of an arterial oscillograph prototype for the assessment of peripheral arterial occlusive disease (PAOB) of the lower extremities is introduced, together with the characterization and evaluation of the measurement instrument.
METHOD:
This study enrolled 12 individuals, whose 20 legs were evaluated with the arterial oscillograph prototype and compared with the results obtained by angiography, the gold standard for the diagnosis of PAOB.
RESULTS:
The sensitivity of the assessment instrument was 78% and the specificity was 76% for detecting PAOB.
CONCLUSION:
The sensitivity and specificity of the instruments are important factors to consider when choosing an instrument.
|
A. Jović (Sveučilište u Zagrebu Fakultet elektrotehnike i računarstva, Zagreb, Croatia), K. Jozić (INA - industrija nafte, d.d., Zagreb, Croatia), D. Kukolja, K. Friganović, M. Cifrek (Sveučilište u Zagrebu Fakultet elektrotehnike i računarstva, Zagreb, Croatia) Parallelization in Biomedical Time Series Analysis Web Platform: the MULTISAB Project Experience
Parallel execution of operations required for biomedical time series (BTS) analysis is an important issue in the optimization of medical software efficiency. We investigate the applicability of several parallelization approaches to BTS analysis in the context of feature extraction from multiple heterogeneous BTS on a Java-based web platform designed for medical diagnostics. Considering only the parallelization of the calculation of many different BTS features, our research suggests that parallelization based on simple Java multithreading works best. The threads are assigned based on the data and analysis parameters provided, where feature extraction parallelization is performed on the following levels: 1) multiple segments; 2) multiple signal trails; 3) multiple patient records. The synchronization mechanism should be simple: the analysis continues once all threads terminate their work and record the extracted feature vectors. A special case, when features from multiple signal trails (multivariate features) are extracted, includes only multithreading on the segment and record levels. We also provide an overview of the web platform architecture to put the parallelized parts into the overall perspective.
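A conceptual sketch of the segment-level parallelization described above, written in Python for brevity although the MULTISAB platform itself is Java-based; the feature function and thread count are placeholders.

    from concurrent.futures import ThreadPoolExecutor

    def extract_features(segment):
        # Placeholder for the per-segment feature computation.
        return [sum(segment) / len(segment)]

    def extract_all(segments, n_threads=8):
        with ThreadPoolExecutor(max_workers=n_threads) as pool:
            # map() collects one feature vector per segment; the analysis
            # continues only after all worker threads have finished.
            return list(pool.map(extract_features, segments))

The same pattern can be nested over records and signal trails, which mirrors the three parallelization levels listed in the abstract.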
|
D. Stamenov, M. Gusev, G. Armenski (Faculty of Computer Science and Engineering, Skopje, Macedonia) Interoperability of ECG Standards
The increased demand for interoperability of electronic health records leads to the need for establishing reliable standards for storing and retrieving electrocardiogram data. Multiple medical record data standards were created in response to medical organization initiatives, based on binary and XML formats. These standards raise the need to maximize interoperability between systems that share ECG datasets. HL7 aECG, SCP-ECG, DICOM and ISHNE are among the most popular ECG standards, and they are compared within this paper. We have implemented an adapter system, ECGConvert, which provides conversion of raw ECG to HL7 aECG and SCP-ECG, while also supporting conversion from the ISHNE format to HL7 aECG. We provide a discussion of the interoperability of these standards in healthcare, the problems we have faced and a solution to improve the process of sharing ECG datasets between organizations. Our interoperability platform can support wearable ECG devices that provide single-channel ECG streaming data. A raw ECG record with metadata can be converted to any of the currently used ECG standards by the built prototype.
|
D. Stamenov, D. Venov, M. Gusev (Faculty of Computer Science and Engineering, Skopje, Macedonia) Scalability Performance Evaluation of the E-Ambulance Software Service
E-Ambulance is a Software as a Service (SaaS) solution for computer-assisted diagnoses of ECG recordings. It is an expert system providing doctors and their patients with a cloud solution for computer-assisted diagnoses of ECG recordings and a means of collaborative monitoring and treatment. In this paper, we aim at evaluating the scalability performance of the E-Ambulance SaaS solution by presenting the architecture of the SaaS application and the annotation REST service for diagnoses/anamneses, followed by the testing methodology. The overall goal is to evaluate the scalability performance and find the optimal architecture that will offer the highest performance for the user.
We have developed a prototype of the main SaaS application and a prototype of one web service to be used within the SaaS application for the management of annotations. The two were developed in different technologies to find out whether the technology has any influence on performance.
|
I. Tomašić (Mälardalen University, Västerås, Sweden), K. Khosraviani, P. Rosengren (CNet Svenska AB, Stockholm, Sweden), M. Jörntén-Karlsson (AstraZeneca R&D, Gothenburg, Sweden), M. Lindén (Mälardalen University, Västerås, Sweden) Enabling IoT Based Monitoring of Patients’ Environmental Parameters: Experiences from Using OpenMote with OpenWSN and Contiki-NG
Remote health monitoring can be leveraged by the IoT paradigm. We tested OpenMotes, state-of-the-art IoT devices featuring the IEEE 802.15.4 protocol, with Contiki-NG and OpenWSN, two of the most popular IoT operating systems, for the purpose of obtaining data from the OpenMote's sensors. The procedure is not straightforward and requires additional programming of the operating systems. All the steps necessary to make the OpenMote work with Contiki-NG are presented in detail. We also describe how to use the Copper Firefox add-on to access the sensor data over CoAP, as well as how to design a functional web interface in Node-RED for the sensor data.
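A hedged sketch of reading a CoAP resource programmatically, using the aiocoap Python library purely as an illustration (the paper itself uses the Copper Firefox add-on and Node-RED); the device address and resource path are hypothetical.

    import asyncio
    from aiocoap import Context, Message, GET

    async def read_sensor(uri):
        protocol = await Context.create_client_context()
        response = await protocol.request(Message(code=GET, uri=uri)).response
        return response.payload    # raw CoAP payload from the mote's sensor resource

    payload = asyncio.run(read_sensor('coap://[fd00::1]/sensors/temperature'))

Any CoAP client, whether a browser add-on, a Node-RED node or a script like this one, talks to the same resource exposed by the mote's operating system.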
|
|
Basic information:
Chairs:
Karolj Skala (Croatia), Roman Trobec (Slovenia), Davor Davidović (Croatia)
Steering Committee:
Enis Afgan (Croatia), Lene Krøl Andersen (Denmark), Marian Bubak (Poland), Tiziana Ferrari (The Netherlands), Montserrat Gonzalez (England), Aneta Karaivanova (Bulgaria), Dieter Kranzlmüller (Germany), Maria Lindén (Sweden), Ludek Matyska (Czech Republic), Jesús Carretero Pérez (Spain), Dana Petcu (Romania), Karolj Skala (Croatia), Uroš Stanič (Slovenia), Roman Trobec (Slovenia), Tibor Vámos (Hungary), Yingwei Wang (Canada), Roman Wyrzykowski (Poland)
Registration/Fees:
REGISTRATION / FEES
Price in EUR | Before May 7, 2018 | After May 7, 2018
MIPRO and IEEE members | 180 | 200
Students (undergraduate and graduate) and primary and secondary school teachers | 100 | 110
Others | 200 | 220
|
The discount does not apply to doctoral students.
Contact:
Karolj Skala
Institut Ruđer Bošković
Centar za informatiku i računarstvo
Bijenička 54
10000 Zagreb, Croatia
E-mail: skala@mipro.hr
The best papers will receive awards.
Accepted papers will be published in the conference proceedings with an ISBN number. Papers presented at the conference will be submitted for publication in the IEEE Xplore database.
Authors of outstanding papers will be invited to submit an extended version of their paper for consideration for a special issue of the journal Scalable Computing: Practice and Experience (ISSN 1895-1767), planned for publication in the first quarter of 2019.
Chair of the International Programme Committee:
Karolj Skala (Croatia)
International Programme Committee:
Enis Afgan (Croatia), Slaviša Aleksić (Germany), Slavko Amon (Slovenia), Lene Andersen (Denmark), Vesna Anđelić (Croatia), Michael E. Auer (Austria), Dubravko Babić (Croatia), Snježana Babić (Croatia), Almir Badnjevic (Bosnia and Herzegovina), Marko Banek (Croatia), Mirta Baranović (Croatia), Bartosz Bebel (Poland), Ladjel Bellatreche (France), Petar Biljanović (Croatia), Eugen Brenner (Austria), Ljiljana Brkić (Croatia), Gianpiero Brunetti (Italy), Marian Bubak (Poland), Andrea Budin (Croatia), Željko Butković (Croatia), Željka Car (Croatia), Jesús Carretero Pérez (Spain), Matjaž Colnarič (Slovenia), Alfredo Cuzzocrea (Italy), Marina Čičin-Šain (Croatia), Marko Čupić (Croatia), Davor Davidović (Croatia), Marko Delimar (Croatia), Saša Dešić (Croatia), Todd Eavis (Canada), Maurizio Ferrari (Italy), Tiziana Ferrari (Netherlands), Bekim Fetaji (Macedonia), Nikola Filip Fijan (Croatia), Renato Filjar (Croatia), Tihana Galinac Grbac (Croatia), Enrico Gallinucci (Italy), Dragan Gamberger (Croatia), Paolo Garza (Italy), Liljana Gavrilovska (Macedonia), Ivan Gerlič (Slovenia), Matteo Golfarelli (Italy), Stjepan Golubić (Croatia), Montserrat Gonzales (United Kingdom), Francesco Gregoretti (Italy), Stjepan Groš (Croatia), Niko Guid (Slovenia), Jaak Henno (Estonia), Ladislav Hluchy (Slovakia), Željko Hocenski (Croatia), Vlasta Hudek (Croatia), Darko Huljenic (Croatia), Željko Hutinski (Croatia), Robert Inkret (Croatia), Mile Ivanda (Croatia), Hannu Jaakkola (Finland), Matej Janjić (Croatia), Leonardo Jelenković (Croatia), Rene Jerončić (Croatia), Dragan Jevtić (Croatia), Admela Jukan (Germany), Robert Jones (Switzerland), Peter Kacsuk (Hungary), Aneta Karaivanova (Bulgaria), Tonimir Kišasondi (Croatia), Marko Koričić (Croatia), Tomislav Kosanović (Croatia), Dieter Kranzlmüller (Germany), Marko Lacković (Croatia), Erich Leitgeb (Austria), Maria Lindén (Sweden), Dražen Lučić (Croatia), Marija Marinović (Croatia), Ludek Matyska (Czech Republic), Mladen Mauher (Croatia), Igor Mekjavic (Slovenia), Igor Mekterović (Croatia), Branko Mikac (Croatia), Veljko Milutinović (Serbia), Nikola Mišković (Croatia), Vladimir Mrvoš (Croatia), Jadranko F. Novak (Croatia), Predrag Pale (Croatia), Jesus Pardillo (Spain), Nikola Pavešić (Slovenia), Branimir Pejčinović (United States), Dana Petcu (Romania), Juraj Petrović (Croatia), Damir Pintar (Croatia), Željka Požgaj (Croatia), Slobodan Ribarić (Croatia), Janez Rozman (Slovenia), Rok Rupnik (Slovenia), Dubravko Sabolić (Croatia), Zoran Skočir (Croatia), Ivanka Sluganović (Croatia), Mario Spremić (Croatia), Vlado Sruk (Croatia), Stefano Stafisso (Italy), Uroš Stanič (Slovenia), Ninoslav Stojadinović (Serbia), Jadranka Šunde (Australia), Aleksandar Szabo (Croatia), Laszlo Szirmay-Kalos (Hungary), Davor Šarić (Croatia), Dina Šimunić (Croatia), Zoran Šimunić (Croatia), Dejan Škvorc (Croatia), Velimir Švedek (Croatia), Antonio Teixeira (Portugal), Edvard Tijan (Croatia), A Min Tjoa (Austria), Roman Trobec (Slovenia), Sergio Uran (Croatia), Tibor Vámos (Hungary), Mladen Varga (Croatia), Marijana Vidas-Bubanja (Serbia), Mihaela Vranić (Croatia), Boris Vrdoljak (Croatia), Slavomir Vukmirović (Croatia), Yingwei Wang (Canada), Mario Weber (Croatia), Roman Wyrzykowski (Poland), Damjan Zazula (Slovenia)
Venue:
Opatija, with its 170-year-long tourist tradition, is the leading seaside resort on the eastern side of the Adriatic and one of the best known on the Mediterranean. This town of aristocratic architecture and style has for 170 years been attracting world-famous artists, politicians, kings, scientists and athletes, as well as business people, bankers and managers, and everyone to whom Opatija offers its numerous attractions.
Opatija offers its guests numerous comfortable hotels, excellent restaurants, entertainment venues, art festivals, superb concerts of classical and popular music, well-kept beaches, numerous swimming pools and everything needed for a pleasant stay for guests of different tastes.
In more recent times, Opatija has become one of the best-known congress cities on the Mediterranean, particularly recognizable for the international MIPRO ICT conventions, held there since 1979, which regularly attract over a thousand participants from some forty countries. These conventions promote Opatija as an indispensable technological, business, educational and scientific centre of south-eastern Europe and the European Union in general.
More detailed information can be found at www.opatija.hr and www.visitopatija.com.
|
|