SCIENTIFIC PROGRAMS AND ACTIVITIES
Program and Abstracts
Tuesday, February 1, 2000
11:00-11:30 - Opening address
Claudine Simson, V.P., Disruptive Technology, Network and Business Solutions, Nortel Networks
11:30-12:30 - Modern data analysis and its application to Nortel Networks data
Otakar Fojt, The University of York
In this talk we outline an approach to the analysis of sequential manufacturing and telecom traffic data from industry using techniques from nonlinear dynamics. The aim of the talk is to show the potential of nonlinear techniques for processing real world data and developing new advanced methods of commercial data analysis.
The basic idea is to consider a factory as a dynamical system. A process in the factory generates data which contains information about the state of the system. If it is possible to analyse this data in such a way that knowledge of the system is increased, control and decision-making processes can be improved. Applied in this way, the analysis becomes a basis of competitive advantage for the factory.
First, we give details of the general idea and the type of recorded data together with the necessary preprocessing techniques. We follow this with a description of our analysis. Our approach consists of state space reconstruction, applications of principal component analysis and nonlinear deterministic prediction algorithms. The talk will conclude with our results and with suggestions for future work.
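A minimal sketch of the two named steps, state space reconstruction followed by principal component analysis, on a synthetic signal; the embedding dimension, delay, and test signal below are illustrative assumptions, not the settings used in the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

def delay_embed(x, dim, tau):
    """State space reconstruction: row t is [x[t], x[t+tau], ..., x[t+(dim-1)*tau]]."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[k * tau:k * tau + n] for k in range(dim)])

# Synthetic stand-in for a recorded process variable: a noisy oscillation.
t = np.arange(2000)
x = np.sin(0.05 * t) + 0.1 * rng.standard_normal(len(t))

# Step 1: state space reconstruction.
V = delay_embed(x, dim=5, tau=10)

# Step 2: principal component analysis of the reconstructed vectors.
V0 = V - V.mean(axis=0)
_, s, _ = np.linalg.svd(V0, full_matrices=False)
explained = s ** 2 / np.sum(s ** 2)
print(explained)  # for a clean oscillation, the first two components dominate
```

The variance profile across components indicates how much of the reconstructed dynamics is low-dimensional, which is what makes nonlinear deterministic prediction plausible on such data.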
1:30-2:00 - The need for real-time data analysis in telecommunications
Chris Hobbs, Sr. Mgr., System Architecture, Nortel Networks
A telecommunications network typically comprises many independently controlled layers: from the physical fibre interconnectivity, through wavelengths, STS connexions, ATM Virtual Channels and MPLS Paths, to the end-to-end connexions established for user services. Each of these layers generates statistics that, in a large network, may easily be measured in tens of gigabytes per hour.
Traditionally, the layers have been controlled individually since the complexity of "tuning" a lower layer to the traffic it is carrying has been too great for human operators (particularly where the carried traffic itself has complex statistics) and since the work involved in moving connexions (particularly fibres and wavelengths) has been prohibitive.
Technological advances in optical switches, capable of logically re-laying fibres or wavelengths in microseconds, have made flexible network rebalancing possible, and Carriers, the owners of these large networks, are demanding lower costs by combining layers and exploiting this new agility. To address this problem, the terabytes of data extracted daily from large networks need to be analysed: first statically, to determine the gross inter-related behaviours, and then dynamically, to detect and react to changing traffic patterns.
2:30-3:30 - Noise reduction for human speech using chaos-like features
Holger Kantz, Max-Planck-Institut für Physik komplexer Systeme
A local projective noise reduction scheme, originally developed for low-dimensional stationary signals, is successfully applied to human speech. This is possible by exploiting properties of the speech signal which mimic structure exhibited by deterministic chaotic systems. In high-dimensional embedding spaces, the strong non-stationarity is resolved as a sequence of different dynamical regimes of moderate complexity. This filtering technique does not make use of the spectral contents of the signal and is far superior to the Ephraim-Malah adaptive filter.
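The core of a local projective scheme can be sketched in a few lines. The sketch below is a single cleaning pass on a synthetic two-frequency signal rather than speech, and the embedding dimension, neighbourhood size, and subspace rank are illustrative assumptions: each delay vector is projected onto the principal subspace of its nearest neighbours.

```python
import numpy as np

rng = np.random.default_rng(5)

def local_projective_denoise(x, dim=7, k=20, q=2):
    """One pass of local projective noise reduction: embed the signal, and for
    each delay vector project onto the q-dimensional subspace spanned by the
    leading principal directions of its k nearest neighbours; then average the
    corrected coordinates back into the signal."""
    n = len(x) - dim + 1
    V = np.column_stack([x[i:i + n] for i in range(dim)])
    corrected = np.zeros(len(x))
    counts = np.zeros(len(x))
    for t in range(n):
        d = np.sum((V - V[t]) ** 2, axis=1)
        nbrs = V[np.argsort(d)[:k]]          # neighbourhood (includes V[t])
        mu = nbrs.mean(axis=0)
        _, _, Vt = np.linalg.svd(nbrs - mu, full_matrices=False)
        proj = mu + (V[t] - mu) @ Vt[:q].T @ Vt[:q]
        corrected[t:t + dim] += proj         # each sample gets several votes
        counts[t:t + dim] += 1
    return corrected / counts

# Clean low-dimensional signal plus additive noise, standing in for a speech frame.
t = np.arange(1500)
clean = np.sin(0.07 * t) + 0.5 * np.sin(0.19 * t)
noisy = clean + 0.2 * rng.standard_normal(len(t))
denoised = local_projective_denoise(noisy)
err_before = np.mean((noisy - clean) ** 2)
err_after = np.mean((denoised - clean) ** 2)
print(err_before, err_after)  # err_after should come out noticeably smaller
```

Note that, as the abstract stresses, nothing here touches the spectral content of the signal: the filter acts purely on the local geometry of the embedded data.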
4:00-5:00 - Scaling phenomena in telecommunications
Murad Taqqu, Boston University (Lecture co-sponsored by Dept. of Statistics, University of Toronto)
Ethernet local area network traffic appears to be approximately statistically self-similar. This discovery, made about eight years ago, has had a profound impact on the field. I will try to explain what statistical self-similarity means and how it is detected. I will also indicate how its presence can be explained physically, by aggregating a large number of "on-off" renewal processes whose distributions are heavy-tailed. As the size of the aggregation grows, the rescaled behavior turns out to be the Gaussian self-similar process called fractional Brownian motion. If, however, the rewards, instead of being 0 and 1, are heavy-tailed as well, then the limit is a stable non-Gaussian process with infinite variance and dependent increments. Since linear fractional stable motion is the stable counterpart of the Gaussian fractional Brownian motion, a natural conjecture is that the limit process is linear fractional stable motion. This conjecture, it turns out, is false. The limit is a new type of infinite-variance self-similar process.
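Both ingredients of the abstract can be illustrated numerically: the aggregated-variance method is one standard way self-similarity "is detected" (for a self-similar series the variance of block means scales like m**(2H - 2), giving the Hurst exponent H from a log-log slope), and a superposition of heavy-tailed on-off sources is the physical explanation mentioned above. The source counts, sample sizes, and Pareto exponent below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def hurst_aggvar(x, block_sizes):
    """Aggregated-variance Hurst estimator: fit the log-log slope of the
    variance of block means against block size m; H = 1 + slope / 2."""
    v = []
    for m in block_sizes:
        n = len(x) // m
        v.append(x[:n * m].reshape(n, m).mean(axis=1).var())
    slope, _ = np.polyfit(np.log(block_sizes), np.log(v), 1)
    return 1 + slope / 2

blocks = [1, 2, 4, 8, 16, 32, 64]

# Sanity check: iid noise has H = 0.5 (variance of means decays like 1/m).
h_iid = hurst_aggvar(rng.standard_normal(100_000), blocks)

def on_off_source(n, alpha=1.5):
    """A 0/1 source whose on and off durations are heavy-tailed (Pareto)."""
    out = np.empty(n)
    t, state = 0, 1.0
    while t < n:
        dur = int(np.ceil(rng.pareto(alpha) + 1))
        out[t:t + dur] = state
        t += dur
        state = 1.0 - state
    return out

# Superpose many such sources, as in the physical explanation above.
traffic = sum(on_off_source(100_000) for _ in range(30))
h_traffic = hurst_aggvar(traffic, blocks)
print(h_iid, h_traffic)  # h_traffic should come out well above 0.5
```

For on/off durations with tail exponent alpha, theory predicts H = (3 - alpha) / 2, so alpha = 1.5 targets H near 0.75; the finite-sample estimate will scatter around that.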
Wednesday, February 2, 2000
9:30-10:30 - Electrical/Biological networks of nonlinear neurons
Henry Abarbanel, Institute for Nonlinear Science at UCSD, San Diego
Using analysis tools for time series from nonlinear sources, we have been able to characterize the chaotic oscillations of individual neurons in a small biological network that controls simple behavior in an invertebrate. Using these characteristics, we have built computer simulations and simple analog electronic circuits, which reproduce the biological oscillations. We have performed experiments in which biological neurons are replaced by the electronic neurons retaining the functional behavior of the biological circuits. We will describe the nonlinear analysis tools (widely applicable), the electronic neurons, and the experiments on neural transplants.
11:00-11:30 - E-commerce and data mining challenges
Weidong Kou, IBM Centre for Advanced Studies
E-commerce over the Internet is having a profound impact on the global economy. Goldman, Sachs & Co. estimates that B2B e-commerce revenue alone will grow to $1.5 trillion (US) over the next five years. Electronic commerce is becoming a major channel for conducting business, with an increasing number of organizations developing, deploying and installing e-commerce products, applications and solutions.
With rapid e-commerce growth come many challenges: for example, how to analyze e-commerce data and provide an organization with meaningful information to improve its product and service offerings to target customers, and how to group the millions of web users who access a site so that the organization can serve each group better, reduce business costs and increase revenue. These challenges present many opportunities for data mining researchers to develop better intelligent algorithms and systems for practical e-commerce problems. In this talk, we will use IBM Net.Commerce as an example to explain e-commerce development and the challenges we face today.
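One concrete version of the user-grouping problem, sketched far below web scale: plain k-means clustering (Lloyd's algorithm) on two hypothetical per-user session features. The feature definitions, cluster locations, and sizes are invented for illustration and are not drawn from the talk.

```python
import numpy as np

rng = np.random.default_rng(4)

def kmeans(X, k, iters=50):
    """Plain Lloyd's k-means: assign each point to its nearest centroid,
    then move each centroid to the mean of its assigned points."""
    centroids = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = X[labels == j].mean(axis=0)
    return labels, centroids

# Hypothetical per-user features: [pages per session, minutes on site].
browsers = rng.normal([3, 2], [1.0, 0.5], size=(300, 2))    # quick visitors
shoppers = rng.normal([20, 15], [3.0, 3.0], size=(200, 2))  # engaged buyers
X = np.vstack([browsers, shoppers])

labels, centroids = kmeans(X, k=2)
print(sorted(np.bincount(labels).tolist()))  # cluster sizes should recover 200 / 300
```

At real scale the same idea needs sampling or streaming variants, which is exactly the kind of algorithmic challenge the abstract points at.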
11:30-12:00 - Occurrence of ill-defined probability distribution in real-world data
John Hudson, Advisor, Radio Technology, Nortel Networks
In many communications problems the statistics of the data, the communication channels, and the behaviour of users are ill defined and not handled well by the simpler concepts of classical probability theory. We can have data with alpha-stable (infinite variance) characteristics, long-tailed and large-variance lognormal distributions, self-similarity in the time domain, and so on. If the higher moments of the underlying distributions do not exist or have disproportionate values, then the laws of large numbers and the central limit theorem cannot be safely applied to a surprising number of problems. The behaviour of some control mechanisms can begin to take on a chaotic appearance when driven by such data.
In this talk, some of the properties of data, channels and systems confronting workers in the communications field are discussed, illustrated with examples taken from network data traffic, Internet browsing, radio propagation, video images, speech statistics and so on.
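The failure of the law of large numbers for infinite-variance data is easy to see numerically. A minimal illustration, using the Cauchy distribution (the alpha-stable case with alpha = 1) against Gaussian data of the same length:

```python
import numpy as np

rng = np.random.default_rng(2)

# Running sample means for Gaussian (finite variance) versus Cauchy
# (alpha-stable, infinite variance) data.
n = 100_000
gauss = rng.standard_normal(n)
cauchy = rng.standard_cauchy(n)

idx = np.arange(1, n + 1)
gauss_means = np.cumsum(gauss) / idx
cauchy_means = np.cumsum(cauchy) / idx

# The Gaussian running mean settles near 0; the Cauchy "mean" keeps
# wandering, since the mean of n Cauchy samples is again Cauchy.
print(abs(gauss_means[-1]), abs(cauchy_means[-1]))
spread_g = gauss_means[n // 2:].max() - gauss_means[n // 2:].min()
spread_c = cauchy_means[n // 2:].max() - cauchy_means[n // 2:].min()
print(spread_g, spread_c)
```

Any control mechanism or estimator that implicitly averages such data inherits this wandering, which is one route to the chaotic-looking behaviour the abstract mentions.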
1:30-2:30 - The analysis of experimental time series
Tom Mullin, The University of Manchester
We will discuss the application of modern dynamical systems time series analysis methods to data from experimental systems. These will include vibrating beams, nonlinear oscillators and physiological measures. The emphasis will be placed on obtaining quantitative estimates of the essential dynamics. We will also describe the application of data synergy methods to multivariate data.
2:30-3:00 - Fuzzy-pharmacology: Rationale and applications
Beth Sproule, Faculty of Pharmacy and Department of Psychiatry, Psychopharmacology, Sunnybrook Health Sciences Centre, Toronto
Pharmacological investigations are undertaken in order to optimize the use of medications. The complexity and variability associated with biological data have prompted our explorations into the use of fuzzy logic for modeling pharmacological systems. Fuzzy logic approaches have been used in other areas of medicine (e.g., imaging technologies, control of biomedical devices, decision support systems); however, their use in pharmacology is incipient. We will present the results of our preliminary studies, in which we assessed the feasibility of fuzzy logic: a) to predict serum lithium concentrations in elderly patients; and b) to predict the response of alcohol-dependent patients to citalopram in attempting to reduce their drinking. Many projects have since evolved from this work, and approaches to this line of investigation will be presented.
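To make the mechanics concrete, here is a toy Mamdani-style fuzzy inference step of the kind used for such predictions. Everything in it, the rules, the triangular membership ranges, and the output centroids, is a hypothetical illustration and not the study's actual model.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def predict_level(dose, age):
    """Toy fuzzy inference with two hypothetical rules:
       IF dose is high AND age is old   THEN level is high
       IF dose is low  OR  age is young THEN level is low
    defuzzified as a weighted average of rule output centroids."""
    dose_high = tri(dose, 600, 1200, 1800)   # daily dose in mg (illustrative)
    dose_low = tri(dose, 0, 300, 900)
    age_old = tri(age, 60, 85, 110)          # age in years (illustrative)
    age_young = tri(age, 10, 35, 65)

    w_high = min(dose_high, age_old)         # fuzzy AND -> min
    w_low = max(dose_low, age_young)         # fuzzy OR  -> max
    centroids = {"high": 1.2, "low": 0.4}    # hypothetical mmol/L centroids
    if w_high + w_low == 0:
        return None
    return (w_high * centroids["high"] + w_low * centroids["low"]) / (w_high + w_low)

print(predict_level(dose=1100, age=80), predict_level(dose=300, age=30))
```

The appeal for biological data is visible even in the toy: inputs shade gradually between categories instead of crossing hard thresholds, so the predicted output varies smoothly with dose and age.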
3:30-4:30 - Geospatial backbones of environmental monitoring programs: the challenges of timely data acquisition, processing and visualization
Chad P. Gubala, Director, The Scientific Assessment Technologies Laboratory, University of Toronto
When considering ‘environmental’ issues or legalities, a general and useful description of a pollutant is an element or entity in the wrong place at the wrong time and perhaps in the wrong amount. Prior to the establishment of cost-effective global positioning, monitoring the fate and transport of environmental pollutants was limited to reduced-scale, statistically based sampling programs. Whole-system models developed from parcels of environmental studies have been limited in predictive capability due to unnoticed attributes, undocumented synergies or antagonisms, and unquantifiable spatial and temporal variances.
Advances in commercial geospatial technologies and high-speed sensor arrays now offer the possibility of assessing a whole ecosystem in near real time and in a spatially complete manner. This capacity should greatly improve quantitative environmental modeling and the adaptive management process, further ‘tuning’ the balance between global environments and economies. However, the promise of increased knowledge about our natural resources is now limited by our capacity to move the data collected from integrated geopositioning and sensor systems into meaningful management products. This talk describes these limitations and addresses the need for developments in real-time analytical protocols.
4:30-5:00 - Data mining and its challenges in the banking industry
Chen Wei Xu, Manager, Statistical Modeling, Customer Knowledge Management, Bank of Montreal
Thursday, February 3, 2000
Friday, February 4, 2000
9:30-10:30 - Interdisciplinary application of time series methods inspired by chaos theory
Thomas Schreiber, University of Wuppertal
We report on real world applications of time series methods developed on the basis of the theory of deterministic chaos. First, we demonstrate statistical criteria for the necessity of a nonlinear approach. Nonlinear processes are not in general purely deterministic. Then we discuss modified methods that can cope with noise and nonstationarities. In particular, we will discuss nonlinear filtering, signal classification, and the detection of nonlinear coherence between processes.
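The "statistical criteria for the necessity of a nonlinear approach" typically take the form of a surrogate data test: compare a nonlinear statistic of the data against its distribution over surrogates that share the data's power spectrum but are consistent with a linear process. A minimal sketch, with the test signal and the particular statistic (time-reversal asymmetry) chosen as illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)

def phase_surrogate(x):
    """Surrogate with the same power spectrum but randomized Fourier phases,
    i.e. consistent with a linear Gaussian process."""
    f = np.fft.rfft(x)
    phases = np.exp(1j * rng.uniform(0, 2 * np.pi, len(f)))
    phases[0] = 1.0            # keep the mean
    if len(x) % 2 == 0:
        phases[-1] = 1.0       # keep the Nyquist bin real
    return np.fft.irfft(f * phases, n=len(x))

def rev_asym(x, lag=1):
    """Time-reversal asymmetry statistic; zero in expectation for
    linear Gaussian processes."""
    return np.mean((x[lag:] - x[:-lag]) ** 3)

# Nonlinear test signal: a chaotic logistic map iteration.
n = 4096
x = np.empty(n)
x[0] = 0.3
for t in range(1, n):
    x[t] = 3.9 * x[t - 1] * (1 - x[t - 1])

stat = abs(rev_asym(x))
surr = [abs(rev_asym(phase_surrogate(x))) for _ in range(39)]
print(stat > max(surr))  # True suggests a linear model cannot explain x
```

With 39 surrogates, the data statistic exceeding all of them rejects the linear-process hypothesis at roughly the 2.5 percent level for a one-sided test.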
11:00-12:00 - Symbolic data compression concepts for analyzing experimental data
Matt Kennel, Institute for Nonlinear Science at UCSD, San Diego
1:00-2:00 - Geometric time series analysis
Mark Muldoon, UMIST (University of Manchester Institute of Science and Technology)
A discussion of a circle of techniques, all developed within the last 20 years and all loosely organized around the idea that one can extract detailed information about a dynamical system (say, the equations of motion governing some industrial process...) by forming vectors out of successive entries in a time series of measurements.
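The idea of forming vectors from successive entries, and then exploiting their geometry, can be sketched with the simplest such technique, nearest-neighbour ("method of analogues") prediction. The test system (a Hénon map observable) and the window length are illustrative assumptions:

```python
import numpy as np

def delay_vectors(x, dim):
    """Rows are successive entries [x[t], x[t+1], ..., x[t+dim-1]]."""
    n = len(x) - dim + 1
    return np.column_stack([x[k:k + n] for k in range(dim)])

# Scalar observable of the Henon map, a standard deterministic test series.
n = 3000
x = np.empty(n)
x[0] = x[1] = 0.1
for t in range(2, n):
    x[t] = 1.0 - 1.4 * x[t - 1] ** 2 + 0.3 * x[t - 2]

dim = 2
train, test = x[:2500], x[2500:]
V = delay_vectors(train, dim)[:-1]   # delay vectors that have a known successor
targets = train[dim:]                # the value following each vector

# Method of analogues: predict the successor of each test vector using the
# successor of its nearest neighbour among the training vectors.
W = delay_vectors(test, dim)[:-1]
preds = np.array([targets[np.argmin(np.sum((V - w) ** 2, axis=1))] for w in W])
errors = np.abs(preds - test[dim:])
print(errors.mean())  # small for deterministic dynamics
```

The point of the geometric view is exactly this: nothing about the equations of motion is assumed, yet neighbouring delay vectors have neighbouring futures, so the reconstructed geometry alone supports prediction.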
2:30-3:30 - Chaotic communication using optical and wireless devices
Henry Abarbanel, Institute for Nonlinear Science at UCSD, San Diego
3:30-4:30 - Status of cosmic microwave background data analysis: motivations and methods
Simon Prunet, CITA (Canadian Institute for Theoretical Astrophysics), University of Toronto
After a brief review of the physics that motivates measurements of Cosmic Microwave Background anisotropies, I will present the current observational status, the analysis methods used so far, and the challenge posed by the upcoming huge data sets from future satellite experiments.