
Driving the Future with Smart and Intelligent ICT

Event program
Thursday, 5/24/2012 9:00 AM - 1:00 PM,
Liburna, Hotel Admiral, Opatija
Invited Papers
 
1. L. Bellatreche (POITIERS UNIVERSITY - LISI/ENSMA, Futuroscope Chasseneuil cedex, France), S. Khouri, I. Boukhari, R. Bouchakri (LIAS/ENSMA Poitiers University, Futuroscope, France)
Using Ontologies and Requirements for Constructing and Optimizing Data Warehouses 
Developing database (DB) and data warehouse (DW) applications passes through three main phases imposed by the ANSI/SPARC architecture: conceptual modeling, logical modeling and physical modeling. Some research efforts add a new ontological level above the conceptual one. This architecture has created two main actors whose presence is mandatory to ensure the success of applications: the “conceptual designer” for the conceptual and logical levels and the “database administrator” (DBA) for the physical level. Note that some administration tasks need inputs from the conceptual phase; unfortunately, interaction between these two actors is negligible. Recently, some research and industrial efforts have identified the high cost of DBAs and proposed tools (advisors) to replace them, in order to ensure what we call zero-administration. The main limitation of these tools is their robustness. In this paper, we propose a new human resource management for database applications. Instead of replacing the DBA, we propose to delegate some DBA tasks to conceptual designers, usually those tasks whose inputs are user requirements that may be translated into SQL queries. First, we propose to make user requirements persistent in the DW. An analysis of the requirements is then given to identify SQL queries that may be used in the physical design phase. Finally, a selection of indexes based on user requirements is presented and evaluated using the Star Schema Benchmark.
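The abstract does not reproduce the authors' requirement model or selection algorithm; as a rough illustration of deriving candidate indexes from a requirement-driven SQL workload, the following Python sketch counts how often columns appear in predicates (the queries, column names and frequency threshold are invented for the example).

```python
import re
from collections import Counter

# Hypothetical workload of SQL queries derived from user requirements
# (the requirement model in the paper is richer; these are placeholders).
workload = [
    "SELECT SUM(lo_revenue) FROM lineorder, dates WHERE lo_orderdate = d_datekey AND d_year = 1997",
    "SELECT SUM(lo_revenue) FROM lineorder, customer WHERE lo_custkey = c_custkey AND c_region = 'EUROPE'",
    "SELECT COUNT(*) FROM lineorder, dates WHERE lo_orderdate = d_datekey AND d_year = 1997",
]

def predicate_columns(query):
    """Very naive extraction of column names referenced after WHERE."""
    where = query.split("WHERE", 1)[1] if "WHERE" in query else ""
    return set(re.findall(r"\b([a-z]\w*_\w+)\b", where))

# Count how often each column appears in a predicate across the workload
# and propose single-column indexes for the most frequently used ones.
usage = Counter(col for q in workload for col in predicate_columns(q))
for col, freq in usage.most_common():
    if freq >= 2:  # hypothetical frequency threshold
        print(f"candidate index on column {col} (referenced by {freq} queries)")
```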
2. F. Ho, G. Lohman (IBM, San Jose, United States)
Business Analytics in (a) Blink 
The Blink project’s ambitious goal is to answer all Business Intelligence (BI) queries in mere seconds, regardless of the database size, with an extremely low total cost of ownership. Blink is a new DBMS aimed primarily at read-mostly BI query processing that exploits scale-out of commodity multi-core processors and cheap DRAM to retain a (copy of a) data mart completely in main memory. Additionally, it exploits proprietary compression technology and cache-conscious algorithms that reduce memory bandwidth consumption and allow most SQL query processing to be performed on the compressed data. Blink always scans (portions of) the data mart in parallel on all nodes, without using any indexes or materialized views, and without any query optimizer to choose among them. The Blink technology has thus far been incorporated into two IBM accelerator products generally available since March 2011.
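Blink's compression technology is proprietary and not detailed in the abstract; purely to illustrate the general idea of evaluating a predicate directly on compressed, in-memory column data, here is a minimal dictionary-encoding sketch in Python (the column values are invented).

```python
# Minimal sketch of scanning a dictionary-encoded, in-memory column.
# Blink's actual compression and cache-conscious processing are proprietary;
# this only illustrates filtering on encoded values instead of raw strings.
from array import array

regions = ["EUROPE", "ASIA", "AMERICA", "EUROPE", "ASIA", "EUROPE"]
revenue = array("d", [10.0, 7.5, 3.2, 8.1, 4.4, 9.9])

# Build a dictionary and encode the column as small integer codes.
dictionary = {v: i for i, v in enumerate(sorted(set(regions)))}
codes = array("b", (dictionary[v] for v in regions))

# SELECT SUM(revenue) WHERE region = 'EUROPE':
# the predicate is translated to a code once, then the scan compares codes only.
target = dictionary["EUROPE"]
total = sum(rev for code, rev in zip(codes, revenue) if code == target)
print(total)  # 28.0
```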
3. T. Bronzin (CITUS d.o.o., Zagreb, Croatia), A. Stipić (IN2 d.o.o., Zagreb, Croatia)
Business Intelligence vNext or How Cloud Computing is (not) Changing the Way We Do BI 
Cloud computing and BI (Business Intelligence) are technologies of choice for addressing today's business needs. They allow companies to optimize IT and become more competitive and productive. Implementing BI solutions in the cloud brings both benefits and new challenges that need to be addressed. On the technology side there are security, (increasingly mobile) communication infrastructure and speed. On the business side there are privacy, clever use of the information “push” model, compliance with different laws and regulations, and general sustainability (ecology). New technology and business models are arising because data is widely distributed and BI can (and often must) use new sources of information. Examples of new sources are social networks (Facebook, Twitter …) and search engine analytics (Google, Bing …), which are actually public clouds. Combining these data sources with additional dimensions such as real-time GPS data from mobile devices and access to geo-location services may significantly change the way some businesses are run today. This paper describes the marriage of cloud and mobile computing, a new area of both business and IT opportunities, and the models that might be used to create and implement BI using private, public and hybrid cloud solutions.
4. M. Varga, K. Ćurko (University of Zagreb, Faculty of Economics & Business, Zagreb, Croatia)
Some Aspects of Information Systems Integration 
The paper describes several increasingly important aspects of information systems integration, of which data integration is given the most attention. The common techniques of data integration, such as data consolidation, data federation and data propagation, are discussed. Avoiding information inconsistency in the process of data integration can improve data quality. Among the important features of an integrated information system, special attention is devoted to the closed loop between the operational and analytical parts, which can be achieved with an active data warehouse.
Papers 
1. W. Krathu, C. Pichler, M. Zapletal, H. Werthner (Institute of Software Technology and Interactive Systems, Vienna University of Technology, Vienna, Austria)
Semantic Inter-Organizational Performance Analysis for EDIFACT Using Balanced Scorecard 
Evaluating inter-organizational relationships (IORs) is important in today's businesses for increasing competitiveness and business potential. Typically, IORs are measured by high-level key performance indicators (KPIs) such as trust, knowledge sharing, and others. However, these high-level KPIs are difficult to measure quantitatively, and determining IORs and their outcomes is therefore a complex and ambiguous task. In this position paper, we propose a set of research questions dealing with measuring and evaluating IORs. These include the identification of KPIs from inter-organizational EDIFACT messages and the alignment of KPIs with business goals by means of the Balanced Scorecard methodology and semantic technology. The approach supports us in lifting operational information to the business level. The alignment between KPIs and business goals allows evaluating an organization's IORs and results in two distinct advantages. First, the explicit connection between KPIs and business goals provides a justification of the relation between IORs and business achievements. Second, the evaluation of IORs against strategic goals can be performed semi-automatically.
2. A. Korobko, V. Nicheporchuk, T. Penkova (Institute of Computational Modeling of the Siberian Branch of the Russian Academy of Science, Krasnoyarsk, Russian Federation)
Emergency Situations Monitoring Using OLAP Technology 
The original emergency situations monitoring system “Espla-M” is presented in this paper. “Espla-M” allows the user to perform multi-aspect monitoring of natural and anthropogenic emergency situations, where the aspects represent different data levels and problems of emergency monitoring. Historical data processing makes emergency prevention possible, while on-line state control provides an immediate reaction to emergencies. Analytical data processing allows specialists to make expert estimations of both historical and current data about emergency situations. The success of “Espla-M” lies in the simultaneous use of data processing and storage technologies: data warehouse, OLAP, GIS and expert systems. The system is deployed in the Ministry of Emergency of the Siberian Region.
3. M. Velić, I. Padavić (Initium Futuri d.o.o., Zagreb, Croatia), Z. Lovrić (T-Hrvatski Telekom d.d., Zagreb, Croatia)
Case study: Analysis of the Direct Sales Force Performance - Clients Reach by Geographical Area in Telecommunication Industry 
In today's ever-changing economy, direct sales is of great importance, both as a sales and customer satisfaction tool and as an information retrieval process. In the direct sales business there is the problem of geographical deployment of salesmen and, consequently, of reaching the complete, geographically dispersed client population. In this paper we analyze these parameters on the example of one Croatian telecommunication company. Problems are identified and possible improvements based on some of the travelling salesman problem solutions are analyzed.
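The abstract does not say which travelling salesman problem solutions the authors evaluate; as a generic illustration of routing a salesman through geographically dispersed clients, here is a minimal nearest-neighbour sketch in Python (client names and coordinates are invented).

```python
# Greedy nearest-neighbour tour over hypothetical client coordinates
# (e.g. from geocoded addresses). This is one of the simplest TSP heuristics,
# shown only as an example; the paper's actual approach may differ.
import math

clients = {"A": (0.0, 0.0), "B": (2.0, 1.0), "C": (1.0, 3.0), "D": (5.0, 2.0)}

def nearest_neighbour_tour(points, start):
    """Always visit the closest unvisited client next."""
    unvisited = set(points) - {start}
    tour, current = [start], start
    while unvisited:
        current = min(unvisited, key=lambda p: math.dist(points[current], points[p]))
        tour.append(current)
        unvisited.remove(current)
    return tour

print(nearest_neighbour_tour(clients, "A"))  # ['A', 'B', 'C', 'D']
```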
4. I. Zouari, F. Ghozzi, R. Bouaziz (Mir@cl Laboratory, Sfax, Tunisia)
Constraints to Manage Consistency in Multiversion Data Warehouse 
Defining structural constraints in multidimensional models is certainly a necessity to ensure the consistency of such models, especially the hierarchical structure of dimensions. This necessity increases when considering temporal and multiversion multidimensional models, due to their temporal extension. Our proposed model for managing multiversion data warehouses (MV-DW) supports several structural and temporal constraints. In this paper, we enrich these constraints with the temporal fact-dimension dependency constraint, which deals with the relationship between facts and dimensions in a temporal context. Moreover, schema and instance evolution operators can introduce inconsistencies into the MV-DW schema and/or data. Such inconsistencies can be avoided by checking the defined MV-DW constraints. To this end, we define two classes of technical issues: the first deals with algorithms that have to be run after schema evolution operators to check the consistency of the changed DW schema, and the second deals with triggers that cancel data propagation when an instance constraint violation is detected.
Thursday, 5/24/2012 3:00 PM - 7:00 PM,
Liburna, Hotel Admiral, Opatija
Papers 
1. M. Vranić, D. Pintar, M. Banek (Faculty of Electrical Engineering and Computing, Zagreb, Croatia)
The Impact of Threshold Parameters in Transactional Data Analysis 
Today’s information systems keep large quantities of transactional data which may contain valuable but hidden information. Association rule generation is a method developed for the analysis of this type of data. However, the method is very resource intensive: both the execution time and the final model highly depend on threshold parameters set by the analyst. In this paper we analyze the impact of the minimal support parameter on the number of closed frequent itemsets discovered for various datasets. The analysis is conducted on reference datasets and on real-life datasets which classify transactional elements into hierarchically organized categories. Datasets that are not originally transactional can be transformed into transactional form; they exhibit somewhat different relations between the minimal support parameter and the number of discovered closed frequent itemsets. The findings presented in this paper can serve as guidance in setting up support, the most important parameter affecting the final execution time. To analyze data characteristics from other perspectives, we also show how varying confidence, lift and support affects the number of two-element association rules that can be formed.
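For readers unfamiliar with the minimal support threshold, the following toy Python sketch shows how lowering it inflates the number of frequent itemsets and hence the work to be done; the paper itself studies closed frequent itemsets on far larger reference and real-life datasets, which this example does not reproduce.

```python
# Count all 1- and 2-item sets in a tiny transaction list and show how the
# absolute minimum-support threshold controls how many itemsets survive.
from itertools import combinations
from collections import Counter

transactions = [
    {"bread", "milk"},
    {"bread", "butter", "milk"},
    {"bread", "butter"},
    {"milk"},
]

def frequent_itemsets(transactions, min_support):
    """Return all 1- and 2-item sets meeting the absolute support threshold."""
    counts = Counter()
    for t in transactions:
        for size in (1, 2):
            for itemset in combinations(sorted(t), size):
                counts[itemset] += 1
    return {s: c for s, c in counts.items() if c >= min_support}

for min_support in (1, 2, 3):
    print(min_support, len(frequent_itemsets(transactions, min_support)))
# Lowering min_support sharply increases the number of frequent itemsets.
```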
2. I. Buratović, M. Miličević, K. Žubrinić (Sveučilište u Dubrovniku, Dubrovnik, Croatia)
Effects of Data Anonymization on the Data Mining Results 
Privacy preservation is an important issue in the publication of data for mining purposes. In order to discover data patterns and relationships using data mining techniques, the data must be released in the form of original tuples instead of pre-aggregated statistics. These records usually contain sensitive and even confidential personal information, which raises significant privacy concerns regarding the disclosure of such data. Removing explicit identifiers prior to data release cannot guarantee anonymity, since the datasets still contain information that can be used for linking the released records with publicly available collections that include respondents’ identities. One of the privacy preserving techniques proposed in the literature is k-anonymization. The process of anonymizing a dataset usually involves generalizing data records and consequently incurs a loss of relevant information. In this article, the impact of anonymization is measured by comparing the results of mining the original dataset with the results of mining the altered dataset, to determine whether it is possible to use anonymized data for research purposes.
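As background for readers, the following Python sketch illustrates the basic k-anonymization mechanism of generalizing quasi-identifiers until each combination occurs at least k times; the attributes, generalization rules and records are invented and are not taken from the article.

```python
# Illustrative k-anonymity check: generalize quasi-identifiers and verify
# that every generalized combination occurs at least k times.
from collections import Counter

records = [
    {"age": 34, "zip": "10001"},
    {"age": 36, "zip": "10002"},
    {"age": 35, "zip": "10008"},
    {"age": 52, "zip": "10405"},
    {"age": 57, "zip": "10409"},
]

def generalize(record):
    """Coarsen quasi-identifiers: 10-year age bands, 3-digit ZIP prefix."""
    return (record["age"] // 10 * 10, record["zip"][:3])

def is_k_anonymous(records, k):
    groups = Counter(generalize(r) for r in records)
    return all(count >= k for count in groups.values())

print(is_k_anonymous(records, 2))  # True: (30, '100') x3 and (50, '104') x2
```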
3. M. Boero, M. Jackson (CSI-Piemonte, Torino, Italy)
Setting up, managing and using a complex BI platform 
CSI-Piemonte builds and manages information systems for public authorities in the Piedmont region of Italy. The consortium has adopted a Business Intelligence (BI) platform made up of SAS and Business Objects solutions. The company has been working with SAS for 30 years and today has a Business Intelligence Competency Centre (BICC), set up in order to use and manage the platform at its optimal level; the BICC's roots date back to the 80s. In this platform, both hardware and software of all customers converge on a single, highly reliable infrastructure based on SAS 9.2 and Business Objects XI, and the SAS component is designed as a grid platform. On this architecture CSI-Piemonte designs a wide range of BI applications to support the Piedmont local institutions. The applications cover all BI use cases, also taking into account data quality and data mining issues. The platform includes tools for data integration, data quality, data mining, other analytics, query and reporting, and master data management. The paper focuses on the overall experience of setting up and managing a BI platform of such complexity and on a successful e-government use case.
4. Z. Tekic, D. Kukolj (Faculty of Technical Sciences, University of Novi Sad, Novi Sad, Serbia), L. Nikolic, M. Drazic, M. Pokric, M. Vitas, D. Nemet (RT-RK, Institute for Computer Based Systems, Novi Sad, Serbia)
PSALM - Tool for Business Intelligence 
Today, intellectual property (IP) is recognized as a new currency in society, gaining more importance than ever before. Among intellectual property rights (IPRs), patents are of particular significance. Patents are a unique source of information since they are collected, screened and published according to internationally agreed standards. They are multidimensional documents which provide technical and legal as well as business and public-policy relevant information. These features offer a full spectrum of possibilities for using patent information in core areas of business intelligence: competitor monitoring, technology assessment, R&D portfolio management, the identification and assessment of potential sources for the external generation of technological knowledge, and human resource management. The main stakeholders interested in patent information are managers, entrepreneurs, researchers and inventors, and patent professionals. However, using and managing a set of patents is not easy: the ever-increasing number of patents makes it impossible to find and analyze relevant documents manually. It is therefore important to develop business intelligence tools which ease patent portfolio analysis and enable in-depth understanding of technology trends, the market place and competitors. In order to make decisions about the usage and management of patents it is important to understand the strengths, weaknesses, opportunities and threats related to the managed patent assets. Among other things, patent portfolio management should provide an understanding of the company's current patent position in the market, monitoring of competitors, forecasting of technological development, improvement of the company's decision-making process (where and when to invest), and effective defensive tactics. This paper describes PSALM, a recently developed business intelligence tool. The tool is based on a MySQL database and a web robot, both supported by routines developed in Java and PHP. PSALM assembles patent data from publicly available databases (USPTO, WIPO and EPO), collects and analyses bibliographic parameters of patents, and also performs text mining using the term frequency-inverse document frequency weighting scheme. High-dimensional data contained in the patent documents are transformed into a much lower-dimensional space (2D or 3D) that preserves the structure of the original as closely as possible. The reduced patent data space is organized into clusters, which can be presented and visualized with respect to various contexts such as technology areas, companies, citations, time periods etc. The PSALM functionality and usability are demonstrated on an MPEG-2 related patent portfolio, and its potential for small and medium enterprises is especially highlighted.
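PSALM itself is implemented with MySQL, Java and PHP; the following Python sketch only illustrates the text-mining pipeline the abstract outlines (tf-idf weighting, reduction to a 2D space, clustering), using scikit-learn and invented patent abstracts.

```python
# Rough illustration of the described pipeline: tf-idf weighting,
# projection to 2D, and clustering. The documents are placeholders,
# not real patent data, and PSALM's own algorithms are not reproduced.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.cluster import KMeans

abstracts = [
    "video encoding motion compensation bitstream",
    "motion estimation block matching video",
    "battery charging circuit power management",
    "power converter battery cell protection",
]

tfidf = TfidfVectorizer().fit_transform(abstracts)           # term weighting
coords = TruncatedSVD(n_components=2).fit_transform(tfidf)   # reduce to 2D
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(coords)
print(labels)  # documents grouped into two technology clusters, e.g. [0 0 1 1]
```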
5. J. Mirković (Crnogorski Telecom, Podgorica, Montenegro), L. Kašćelan (Ekonomski fakultet, Podgorica, Montenegro)
Designing Data Mart for Managing IT Department of a Telecommunications Company 
The subject of the research in this paper is the design of a data mart for analyzing the problems and requests that the IT department of a telecommunications company handles in order to operate smoothly and to ensure that the services the company offers to users are available on time. The main decision-making processes were identified during the research in the IT department of a telecommunications company and an appropriate data mart was designed. The concept of realization of the data mart was presented using business intelligence tools. The implemented system was tested on data from the case study company, Crnogorski Telekom AD, where outputs with good performance were successfully generated.
6. L. Humski (FER, Zagreb, Croatia), I. Lažegić (Ericsson Nikola Tesla d.d., Zagreb, Croatia), Z. Skočir (FER, Zagreb, Croatia)
Data Warehouse for FER e-Invoice System 
Electronic business (e-business) includes business transactions and information interchange using information and communication technology in an enterprise, between enterprises and their customers or between enterprises and public administration. An e-invoice is the most widely used electronic document in the world. Although it covers only one segment of the entire supply chain, the e-invoice has a central role in the development of electronic business. On the other hand, the data warehouse is a subject-oriented, integrated, time-variant and non-volatile collection of data in support of management's decision making process. The FER e-invoice system contains business processes that should be monitored and analyzed. Therefore, a data warehouse, containing data extracted from e-invoice XML documents, has been developed.
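The FER e-invoice XML schema is not given in the abstract, so the element names below are hypothetical; the Python sketch merely illustrates the ETL step of extracting invoice-line fact rows from an e-invoice XML document.

```python
# Hypothetical e-invoice document and a minimal extraction of fact rows
# suitable for loading into an invoice-line fact table of a data warehouse.
import xml.etree.ElementTree as ET

invoice_xml = """
<invoice id="2012-0001">
  <issueDate>2012-05-24</issueDate>
  <buyer>ACME d.o.o.</buyer>
  <line><item>Service A</item><amount>120.00</amount></line>
  <line><item>Service B</item><amount>80.50</amount></line>
</invoice>
"""

root = ET.fromstring(invoice_xml)
fact_rows = [
    {
        "invoice_id": root.get("id"),
        "issue_date": root.findtext("issueDate"),
        "buyer": root.findtext("buyer"),
        "item": line.findtext("item"),
        "amount": float(line.findtext("amount")),
    }
    for line in root.findall("line")
]
print(fact_rows)  # rows ready to be loaded into the fact table
```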
7. M. Pighin (University of Udine/Department of Mathematics and Computer Science, Udine, Italy), A. Marzona (LiberaMente Srl, Udine, Italy)
Data Value in Decision Process: Survey on Decision Support System in Small and Medium Enterprises 
This paper describes a survey on the actual use of Business Intelligence (BI) methodologies in Small and Medium Enterprises (SMEs) of the Udine district, a medium-industrialized area in the North East of Italy. The companies were asked to answer a questionnaire prepared by our staff; we also obtained deeper information through specific interviews. The sample was composed of 45 enterprises differing in geographic area, dimension and product specialization. In the paper we describe the rationale, the knowledge target and the methodology of the survey, and we review the economic and cultural context of the district. Finally, we analyze the results about the effective and desired use of BI as a Decision Support System (DSS). We describe the business specializations, company fields and types of enterprises where BI has greater appeal, and we note the different implementations of data warehouses, the analysis tools mostly used, and the personnel involved. In the conclusions we summarize all the results and discuss the difficulty of developing correct and effective BI systems in SMEs.
8. M. Ljubić, S. Skular (Multicom d.o.o., Zagreb, Croatia)
CAR (Collect-Analyze-Report) Ad Hoc Database and User Interface 
This paper presents the key features of the CAR reporting system, its components and the main development steps using Oracle PL/SQL (Procedural Language/Structured Query Language), Java and JavaScript. The system can best be described as an advanced multi-cube OLAP engine with the ability to dynamically add and modify cubes and dimensions according to users' needs, effectively expanding or flattening the cubes. The data can be stored through user input forms or through an ETL (Extract, Transform and Load) process. The application has a web-based user interface that allows management of multidimensional data models, users and security, input forms and reports, bridges and data quality, as well as data delivery with optional XML (Extensible Markup Language) data import for larger data sets, and report execution and preview with export of the analyzed data to different file formats. The security module enables the administrator to set up security rules for each user or user role, allowing a user to see only a subset of a cube's dimensions and coordinates by explicitly defining combinations of coordinates, effectively limiting the user's view of the cube data. As all analytical processing is calculated only when requested, results of performance measurements with different data set sizes are given in this paper.
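CAR is implemented in Oracle PL/SQL, Java and JavaScript; the following Python sketch only illustrates the underlying idea of aggregating a fact table over a dynamically chosen set of dimensions at request time (the dimensions and data are invented).

```python
# Aggregate fact rows over whichever dimensions the user requests,
# computing the result only when asked for, as in an on-demand OLAP engine.
from collections import defaultdict

facts = [
    {"region": "Zagreb", "service": "ADSL", "month": "2012-01", "count": 10},
    {"region": "Zagreb", "service": "IPTV", "month": "2012-01", "count": 4},
    {"region": "Split",  "service": "ADSL", "month": "2012-01", "count": 7},
    {"region": "Zagreb", "service": "ADSL", "month": "2012-02", "count": 12},
]

def aggregate(facts, dimensions, measure="count"):
    """Roll up the fact rows over the requested dimensions only."""
    cube = defaultdict(int)
    for row in facts:
        key = tuple(row[d] for d in dimensions)
        cube[key] += row[measure]
    return dict(cube)

print(aggregate(facts, ["region"]))             # {('Zagreb',): 26, ('Split',): 7}
print(aggregate(facts, ["region", "service"]))  # finer-grained slice
```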
9. J. Pavlić, A. Bojčić (Multicom d.o.o., Zagreb, Croatia)
A System for Analyzing Electronic Mail Usage 
This paper deals with the analysis of electronic mail traffic in complex systems such as Postfix and Cyrus. The paper describes a system that offers a solution to this problem, together with its advantages and disadvantages. The system provides fast and systematic extraction of the hard-to-read, fragmented textual data generated by the activity logging facilities and stores the data in a data warehouse as a central repository, making it available for later multidimensional analysis and reporting. The solution is based on free and open-source software, which reduces costs and makes it suitable for wider use. The implementation is based on the Oracle 10g XE DBMS, the Perl scripting language and JasperSoft software solutions.
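The described solution parses Postfix/Cyrus logs with Perl; as an illustration of the kind of extraction involved, here is a small Python sketch that turns a simplified, approximate log line into a structured record (the line format and field names are not the exact Postfix format).

```python
# Extract a structured record from a simplified mail-delivery log line.
# The format below is an approximation for illustration only.
import re

log_line = ("May 24 10:00:01 mail postfix/smtp[1234]: 4F2A1: "
            "to=<ana@example.com>, relay=mx.example.com, status=sent")

pattern = re.compile(
    r"^(?P<timestamp>\w+ \d+ [\d:]+) \S+ (?P<daemon>\S+)\[\d+\]: "
    r"(?P<queue_id>\w+): to=<(?P<recipient>[^>]+)>.*status=(?P<status>\w+)"
)

match = pattern.match(log_line)
if match:
    record = match.groupdict()   # structured row for the data warehouse
    print(record)
```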
10. D. Oreščanin (Poslovna inteligencija d.o.o., Zagreb, Croatia), J. Ostojić (Zagrebački Holding d.o.o., Zagreb, Croatia)
Feasibility Study of Implementing a Centralized Master Data Management System at Zagrebački Holding 
The core task of Zagrebački Holding is the provision of utility services while protecting the environment and the public interest of the local community. In order to optimize its business processes, the Holding is considering implementing a system for centralized management of master data about clients, locations and facilities, and the products and services it offers to its clients. The paper describes an analysis of the business processes and the existing infrastructure, defines the functional requirements and proposes an architecture for such a system.
11. K. Futivić (IN2 d.o.o., Zagreb, Croatia)
An Empirical Study of BI in Croatia 
This paper empirically investigates, through a questionnaire sent to numerous companies, the requirements that determine the functionalities a BI application should support. These functionalities are ranked, by functionality group and individually, to give a clear picture of what users of BI systems consider important and what less so. In this way the study shows which functionalities end users of BI applications consider important for the application to support, and which are therefore also important criteria when assessing application quality. The requirements of the controlling department are examined in particular: the tasks this department performs in its daily work, the instruments used to carry out those tasks, and how those instruments define the functionalities of a BI system. The functionalities themselves are not necessarily decisive for the successful introduction of a BI system into a company, so the study is extended with an analysis of the characteristics of the project and of the BI system vendor, the costs of the IT solution, training options and other characteristics important for the success of the implementation project.

Basic information:
Chairs:

Mirta Baranović (Croatia), Matteo Golfarelli (Italy), Boris Vrdoljak (Croatia), Roberto Sandri (Croatia)

Program Committee:

Alberto Abello Gamazo (Spain), Marko Banek (Croatia), Mirta Baranović (Croatia), Ladjel Bellatreche (France), Alfredo Cuzzocrea (Italy), Todd Eavis (Canada), Dragan Gamberger (Croatia), Matteo Golfarelli (Italy), A Min Tjoa (Austria), Zoran Skočir (Croatia), Mladen Varga (Croatia), Boris Vrdoljak (Croatia), Robert Wrembel (Poland)

Registration / Fees:

Price in EUR (before May 7, 2012 / after May 7, 2012):
Members of MIPRO and IEEE: 180 / 200
Students (undergraduate), primary and secondary school teachers: 100 / 110
Others: 200 / 220

Contact:

Boris Vrdoljak
Faculty of Electrical Engineering and Computing
Unska 3
HR-10000 Zagreb, Croatia

Phone: +385 1 6129 756
Fax: +385 1 6129 915
E-mail: boris.vrdoljak@fer.hr

Location:

Opatija, often called the Nice of the Adriatic, is one of the most popular tourist resorts in Croatia and the place with the longest tourist tradition on the eastern part of the Adriatic coast. Opatija is so attractive that at the end of the 19th and the beginning of the 20th century it was visited by many of the most prominent personalities of the time: Giacomo Puccini, Pietro Mascagni, Anton Chekhov, James Joyce, Isadora Duncan, Beniamino Gigli, Primo Carnera, Emperor Franz Joseph, German Emperor Wilhelm II, the Swedish royal couple Oscar and Sophia, and King George of Greece.

The offer includes some twenty hotels, a large number of catering establishments, and sports and recreational facilities.
For more details please look at www.opatija.hr/ and www.opatija-tourism.hr/.

 

 
