Invited Speakers

  Geometrical Approach to Big Data 

Václav Snášel
    The Big Data paradigm is one of the main science and technology challenges of today. Big data comprises data sets that are too large or too complex to be processed and analysed efficiently with traditional, or even unconventional, algorithms and tools. The challenge is to derive value from signals buried in an avalanche of noise arising from challenging data volume, flow and validity. 
    The computer science challenges are as varied as they are important: searching for influential nodes in huge networks, segmenting graphs into meaningful communities, modelling uncertainties in health trends for individual patients, controlling complex systems, linking databases with different levels of granularity in space and time, unbiased sampling, connecting with sensor infrastructures, and high-performance computing. Answers to these questions are the key to competitiveness and leadership in this field. 
    Big data is usually modelled as point clouds in a high-dimensional space. One way to understand something about the data is to find a geometric object for which the data looks like a sampling of points; the geometric object then serves as an interpolation of the data. The main tool for studying the qualitative features of geometric objects is topology, which studies only those properties of geometric objects that do not depend on the chosen coordinates or distances, but rather on the intrinsic geometry of the objects.
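A minimal illustration of the idea sketched in the abstract (not taken from the talk itself): the simplest qualitative, coordinate-free feature of a point cloud is its number of connected components at a given scale, i.e. its 0th Betti number. The sketch below links any two points closer than a threshold eps and counts components with union-find; the sample cloud is invented for illustration.

```python
from itertools import combinations

def connected_components(points, eps):
    """Number of connected components of the graph that links any
    two points at Euclidean distance <= eps (union-find)."""
    parent = list(range(len(points)))

    def find(i):
        # Follow parent links to the root, compressing the path as we go.
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    for (i, p), (j, q) in combinations(enumerate(points), 2):
        if sum((a - b) ** 2 for a, b in zip(p, q)) <= eps ** 2:
            parent[find(i)] = find(j)  # merge the two components

    return len({find(i) for i in range(len(points))})

# An invented cloud: two well-separated clusters around (0,0) and (10,10).
cloud = [(0.0, 0.0), (0.5, 0.2), (0.1, 0.6), (10.0, 10.0), (10.4, 9.8)]
print(connected_components(cloud, eps=1.0))   # at a small scale: 2 clusters
print(connected_components(cloud, eps=20.0))  # at a large scale: 1 component
```

How the count changes as eps varies is exactly the kind of scale-dependent, coordinate-independent information that topological data analysis extracts from point clouds.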

    Václav Snášel is Professor of Computer Science at VŠB - Technical University of Ostrava, Czech Republic, where he works as a researcher and university teacher. He is Dean of the Faculty of Electrical Engineering and Computer Science and head of the Knowledge Management research programme at the IT4Innovations European Centre of Excellence.
    His research and development experience spans over 30 years in industry and academia. He works in a multi-disciplinary environment involving artificial intelligence, social networks, concept lattices, information retrieval, the semantic web, knowledge management, data compression, machine intelligence, neural networks, web intelligence, nature- and bio-inspired computing, and data mining, applied to various real-world problems. 
    He has given more than 16 plenary lectures and conference tutorials in these areas, and has authored or co-authored numerous refereed journal and conference papers, books, and book chapters. 
    He has supervised many Ph.D. students from the Czech Republic, Slovakia, Libya, Jordan, Yemen, China and Vietnam; twenty of them have successfully defended their theses.
    He has also served as a guest editor for a number of journals, e.g. Neurocomputing (Elsevier) and the Journal of Applied Logic (Elsevier).
    He has been principal investigator or co-investigator of 15 research projects in basic and applied research.
    He is a Senior Member of the IEEE and chairs the IEEE SMC Czechoslovakia Chapter.

  Statistical Learning and Queuing Models: a Hybrid Approach to Performance Evaluation

G. Rubino, INRIA, France
The ultimate goal when designing an application or a service running on the Internet is the satisfaction of the final user. For a multimedia application or service transporting video, voice or audio content, this mainly translates into obtaining a sufficiently high Perceptual Quality (PQ): the subjective view of the user about how satisfactory that content is when he/she accesses it. Measuring PQ is common practice in today's industry; the definitive way of doing it is by means of a panel of human users who react to the content in a controlled experiment called a subjective test.
We have developed a technology, called PSQA (Pseudo-Subjective Quality Assessment), for automatically measuring this PQ, without any need for panels of human users, when the application or service is built on top of the Internet. It is a technology developed at INRIA capable of providing a numerical (that is, quantitative) and accurate (that is, highly correlated with the results of subjective tests) estimation of the quality of a video, voice or audio flow as perceived by the end user, automatically and in real time if necessary. PSQA is based on specific statistical learning techniques that will be briefly described in the talk. We will then focus on a specific application of the technology: it can be combined with classical stochastic models to provide a hybrid approach, leading to a new way of evaluating the performance of the considered application or service. The talk will describe the philosophy behind this idea, the general architecture, and some examples that illustrate it.
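The core idea of the approach can be sketched in a few lines. PSQA itself is based on random neural networks, but the principle, learning a map from measurable network parameters to subjective quality scores once, then evaluating that map on live measurements instead of running new panels, can be illustrated with any regressor. The sketch below fits an ordinary least-squares model quality ~ f(loss, delay) via the normal equations; the training rows are invented stand-ins for subjective-test data, not real PSQA results.

```python
def fit_linear(samples):
    """Fit quality ~ w0 + w1*loss + w2*delay by solving the 3x3
    normal equations with Gauss-Jordan elimination (pure Python)."""
    # Accumulate X^T X and X^T y over feature rows (1, loss, delay).
    xtx = [[0.0] * 3 for _ in range(3)]
    xty = [0.0] * 3
    for loss, delay, score in samples:
        row = (1.0, loss, delay)
        for i in range(3):
            xty[i] += row[i] * score
            for j in range(3):
                xtx[i][j] += row[i] * row[j]
    # Gauss-Jordan elimination with partial pivoting.
    for c in range(3):
        p = max(range(c, 3), key=lambda r: abs(xtx[r][c]))
        xtx[c], xtx[p] = xtx[p], xtx[c]
        xty[c], xty[p] = xty[p], xty[c]
        for r in range(3):
            if r != c:
                f = xtx[r][c] / xtx[c][c]
                xtx[r] = [a - f * b for a, b in zip(xtx[r], xtx[c])]
                xty[r] -= f * xty[c]
    return [xty[i] / xtx[i][i] for i in range(3)]

# Invented "subjective test" rows: (packet loss %, delay in 100 ms, score 1-5).
panel = [(0.0, 0.2, 4.6), (1.0, 0.5, 3.9), (3.0, 1.0, 3.0),
         (5.0, 1.5, 2.2), (8.0, 2.0, 1.4)]
w = fit_linear(panel)

def predicted_quality(loss, delay):
    # After training, quality is estimated from live measurements alone.
    return w[0] + w[1] * loss + w[2] * delay

print(round(predicted_quality(2.0, 0.8), 2))
```

The hybrid approach mentioned in the abstract plugs such a learned function into a stochastic model of the system: instead of reporting only classical metrics (loss rate, delay), the model's outputs are mapped through the learned function to obtain the perceived quality directly.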

Gerardo Rubino is a senior researcher at INRIA (the French National Institute for Research in Computer Science and Control), where he leads the DIONYSOS group, working on the analysis and design of networking technologies. He is a Board Member of the Media & Networks Cluster, Brittany, France, and INRIA's representative at the SISCom Brittany Research Cluster.

Among past responsibilities, he has been Scientific Delegate of the Rennes unit of INRIA for 5 years, responsible for research in networking at the Telecom Bretagne engineering school for 5 years, Associate Editor of the international operations research journal "Naval Research Logistics" for 9 years, and a member of the Steering Board of the European Network of Excellence EuroFGI, responsible for the relationships between the network and European industry. He has also been head of the International Partnership Office at INRIA Rennes for 5 years. He is a member of the IFIP WG 7.3 and served on the Steering Committee of QEST for many years.

He is interested in the quantitative analysis of complex systems using probabilistic models. He presently works on performance and dependability analysis, and on perceptual quality assessment of audio and video applications and services. In particular, he is the author of the PSQA (Pseudo-Subjective Quality Assessment) technology for automatic real-time evaluation of perceptual quality. He also works on rare event analysis; he is a member of the Steering Committee of RESIM, the only workshop dedicated to the topic, and co-author of the book "Rare Event Simulation Using Monte Carlo Methods" (Wiley, 2009). He is the author of more than 200 scientific works in applied mathematics and computer science.