CODATA 2002: Frontiers of
Scientific and Technical Data

Montréal, Canada — 29 September - 3 October
 

Earth and Environmental Data Abstracts

Proceedings
Table of Contents

Keynote Speakers

Invited Cross-Cutting Themes

Physical Science Data

Biological Science Data

Earth and Environmental Data

Medical and Health Data

Behavioral and Social Science Data

Informatics and Technology

Data Science

Data Policy

Technical Demonstrations

Large Data Projects

Poster Sessions

Public Lectures

Program at a Glance

Detailed Program

List of Participants
[PDF File]


Conference Sponsors

About the CODATA 2002 Conference

 


Track I-C-3:
Frameworks for Sharing Geographic Data

Chair: Michael Goodchild, National Center for Geographic Information and Analysis and University of California, USA

This session reviews emerging technological and institutional models for widespread sharing of geographic data within and among large numbers of scientists and other users of geographic information. The frameworks described are complementary to each other. Individually and together they will facilitate expanded access and ease of use of geographic data across diverse and numerous scientific disciplines.

The framework initiatives to be addressed include:

  1. The National Map,
  2. Geospatial One-Stop,
  3. The Geography Network, and
  4. Frameworks for Sustainability of GIS Development in Low Income Countries.

From a U.S. perspective, the first three of these initiatives are all being developed within the standards and interoperability context of the U.S. National Spatial Data Infrastructure (NSDI). From a global perspective, these spatial database sharing efforts as well as those from many other nations are being developed within the context of the Global Spatial Data Infrastructure (GSDI) initiative.

1. Frameworks for Sustainability of GIS Development in Low Income Countries
Gilberto Camara, Director of Earth Observation, INPE, Brazil

This presentation discusses the development of Geographic Information System (GIS) software and technological approaches pursued in Brazil. Issues encountered in sustaining a complex technology in a large low income country (LIC) are outlined. In the process of describing the Brazilian experience, the prevalent assumption that LICs do not possess the complex technical and human resources required to develop and support GIS and similar technologies is challenged. Challenges, benefits and drawbacks of developing GIS software capabilities locally are examined and a number of important applications where local technology development has contributed to better understanding and cost-effective solutions are highlighted. Finally, some of the potential long-term benefits of a "learning-by-doing" approach and how other countries might benefit from the Brazilian experience are discussed.

 

2. The Geography Network
Clint Brown, ESRI, USA

Many now see the Internet as the most effective means of meeting the accelerating demand for geographically referenced information. Launched by ESRI in June 2000 with the support of the National Geographic Society and many data publishers (EarthSat, GDT, WRI, US EPA, Tele Atlas, Space Imaging, etc.), the Geography Network <www.geographynetwork.com> is a global, collaborative, multi-participant network of geographic information users and providers, including government agencies, commercial organizations, data publishers, and service providers, who use the Internet to share, publish, and use geographically referenced information. The Geography Network can be thought of as a large online library of distributed GIS information available to everyone. Users consult the Geography Network catalog, a searchable index of all information and services available to Geography Network users. A spectrum of simple to advanced GIS and visualization technologies and online tools lets users define areas of interest, search for specific geographic content, and find mapping services. Using any Internet browser, users access data physically located on servers around the globe and can connect to one or more sites at the same time. They can use digital map overlay and visualization, and combine and analyze many types of data from different sources. These data can be delivered immediately to browsers or to desktop GIS software. Thousands of data layers are already available, and Geography Network content is constantly increasing. Much of the content is accessible for free. Commercial content is also provided and maintained by its owners; viewing or downloading commercial content, or using commercial services, is charged through the Geography Network's e-commerce system. Becoming a provider is free and simple to do.
The Geography Network uses open GIS standards and communication protocols, and serves as a test bed for data providers and the Open GIS Consortium. This presentation will show how the system works, explain the facilities provided, indicate the range of providers, describe the genesis of the system and its progress, and discuss future plans and directions.

 

3. Geospatial Information One-Stop
M. Robinson, Federal Geographic Data Committee, USA

The Geospatial One-Stop is part of a Presidential Initiative to improve effectiveness, efficiency, and customer service throughout the U.S. Federal Government. It builds upon the National Spatial Data Infrastructure (NSDI) and will accelerate its development and implementation. Geospatial One-Stop is classified as a Government-to-Government (G2G) project because it will focus on sharing and integrating Federal, State, local, and tribal data, and enable more effective management of government business. The vision is to spatially enable the delivery of government services.

The goals of Geospatial Information One-Stop include: providing fast, low-cost, reliable access to geospatial data for government operations; facilitating G2G interactions needed for vertical missions such as homeland security; supporting the alignment of roles, responsibilities, and resources; and establishing a methodology for obtaining multi-sector input for coordinating, developing, and implementing geographic data and service standards, to create the consistency needed for interoperability and to stimulate market development of tools.

The five major tasks identified in the Project Plan are: 1. Develop and implement data standards for NSDI Framework Data. 2. Fulfill and maintain an operational inventory (based on standardized documentation, using FGDC Metadata Standard) of NSDI Framework Data from Federal agencies, and publish the metadata records in the NSDI Clearinghouse network. 3. Publish metadata of planned acquisition and update activities for NSDI Framework Data from Federal agencies in the NSDI Clearinghouse network. 4. Prototype and deploy data access and web mapping services for NSDI Framework Data from Federal agencies. 5. Establish a comprehensive Federal portal to the resources described in the first four components (standards, priority data, planning information, and products and services), as a logical extension to the NSDI Clearinghouse network.

 

4. The National Map - Sharing Geospatial Data in the 21st Century
Barbara J. Ryan, U.S. Geological Survey, Reston, Virginia, USA

Over the last century, the United States has invested on the order of $1.6 billion and 33 million person hours in the standard (1:24,000 scale) topographic map series. These maps and associated digital data are the country's most extensive geospatial data infrastructure. They are also the only coast-to-coast, border-to-border coverage of our Nation's critical infrastructure - highways, bridges, dams, power plants, airports, etc. It is, however, an asset that is becoming increasingly outdated. These maps range in age from one year, those that were updated last year, to 57 years, those that have never been updated. The average age of these 55,000 maps is 23 years.

In January 2001, the Department of the Interior's U.S. Geological Survey (USGS) undertook a decadal effort to transform the largely paper series into an online, seamless, integrated database known as The National Map. Extensive partnerships with local and State governments, other federal agencies, non-governmental organizations, universities, and the private sector are being forged to construct The National Map. It is not just a "federal" map but a "national" map, an important distinction allowing greater leveraging of limited resources to fulfill the geospatial community's goal of "collect once, use many times."

These maps and related data touch, if not underpin, many sectors of the economy including the housing and development industry, agriculture, transportation, recreation, and emergency preparedness. After September 11th, the USGS provided more than 120,000 maps, hundreds of Landsat images and digital data files to assist with disaster planning, prevention, mitigation, and response efforts conducted at the local, State, and federal levels.

Coordination and standards-development mechanisms like the President's Geospatial One-Stop initiative, the Federal Geographic Data Committee, the Office of Management and Budget Circular A-16, and State-based geographic information consortia both advance and strengthen the policy framework for sharing geospatial data and other information assets of governments. The National Map, much like topographic maps in the last century, is a physical manifestation, in fact a visualization of this policy framework.



Track IV-A-6:
2D and 3D Applications of GIS Systems: Interoperability of Integrated Cartographic Database Management


Chairs: Jacques Segoufin, Institut de Physique du Globe, Paris, France and
Alexei Novikov, National Technical University of Ukraine, Kiev Polytechnic Institute, Ukraine

Depending on the country, the projection of map elements from the curved surface of our planet onto a plane relies on the choice of a reference system.

The choice of a reference ellipsoid varies, and various recommendations are available. The quality of the correspondence among world, regional, and local maps also depends on the type of projection used.

Newly acquired data are increasingly referenced in the UTM84 system.

Several European projects aim to make it easier to convert and transfer local "technical data" so that they can be re-expressed in UTM84.

For this session, it is proposed to address the following problems in particular:

  • Theoretical aspects, problems of reference systems and projections
  • Readjustment of data derived from different grids
  • Multi-parameter integration
  • Networked organization of GIS elements
  • Linking continental and oceanic domains
  • Status of large international projects (UNESCO, IGN, Geological Surveys)
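As an illustration of the reference-system and projection questions raised for this session, here is a minimal spherical Mercator sketch in Python. It is a simplification: operational mapping uses an ellipsoidal datum with an eccentricity correction, and the mean Earth radius `R` below is an assumed round value.

```python
import math

# Spherical Mercator projection: longitude maps linearly to x, while
# latitude is stretched by ln(tan(pi/4 + phi/2)). Units are metres.
R = 6371000.0  # assumed mean Earth radius (spherical approximation)

def mercator(lon_deg, lat_deg):
    """Project geographic coordinates (degrees) to Mercator x, y (metres)."""
    lam = math.radians(lon_deg)
    phi = math.radians(lat_deg)
    x = R * lam
    y = R * math.log(math.tan(math.pi / 4 + phi / 2))
    return x, y

# The equator maps to y = 0; x grows linearly with longitude.
x0, y0 = mercator(0.0, 0.0)
x1, y1 = mercator(80.0, 0.0)
```

The latitude stretching is what makes matching world, regional, and local sheets non-trivial: distances on the map scale differently at every latitude.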

1. Application of methods of space-distributed systems modeling in ecology
M. Zgurovsky, A. Novikov, National Technical University of Ukraine, Kiev Polytechnic Institute

A review of the studies carried out at NTUU "KPI" and the Institute of Cybernetics of the National Academy of Sciences of Ukraine is presented. Two- and three-dimensional equations of diffusion and heat and mass transfer are used as mathematical models. The models make it possible to take account of the spatial distribution, structural non-uniformity, and anomalous properties of the physical processes by which harmful impurities spread in the atmosphere, open water pools, and subsoil waters.

The considered processes are characterized by substantial distribution in space. Therefore, efficient methods of numerical solution of two- and three-dimensional model equations are presented.

Program packages that efficiently solve problems of modeling, prognosis, and estimation of ecological processes in various environments are described.
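To make the kind of numerical scheme involved concrete, here is a minimal sketch of an explicit finite-difference step for the 2-D diffusion equation. The grid size, diffusion coefficient, and initial unit spike are illustrative assumptions, not the authors' actual models.

```python
# Explicit finite differences for u_t = D*(u_xx + u_yy) on the unit
# square, with zero boundaries. The explicit scheme is stable only
# while D*dt/dx**2 <= 0.25, hence the choice of dt below.
N = 21                    # grid points per side
D = 1.0                   # diffusion coefficient
dx = 1.0 / (N - 1)        # grid spacing
dt = 0.2 * dx * dx / D    # time step satisfying the stability bound

# Initial condition: a unit spike of "impurity" at the grid centre.
u = [[0.0] * N for _ in range(N)]
u[N // 2][N // 2] = 1.0

def step(u):
    """Advance one explicit time step; boundaries held at zero."""
    v = [row[:] for row in u]
    for i in range(1, N - 1):
        for j in range(1, N - 1):
            lap = (u[i + 1][j] + u[i - 1][j] + u[i][j + 1] + u[i][j - 1]
                   - 4.0 * u[i][j]) / (dx * dx)
            v[i][j] = u[i][j] + D * dt * lap
    return v

for _ in range(50):
    u = step(u)  # the spike spreads and its peak decays
```

The same stencil extends directly to three dimensions (six neighbours instead of four), which is where the efficient solution methods mentioned above become essential.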

 

2. A Geographic and Ethnopharmacological Survey of the Toxic Plants of Mauritius
A. Fakim-Gurib, University of Mauritius
P. van Brandt, Université catholique de Louvain, Brussels, Belgium

The southwestern region of the Indian Ocean is a geographic zone privileged for its biological diversity and well known for its flora of endemic species. In Mauritius, the traditional use of plants is very common, but many of the plants used can pose potential health risks.

Although some plants are known to contain a large number of biologically active compounds with many beneficial effects for humans and animals, some of these same constituents should be subject to dosage control: misused, they have proven extremely toxic and cause harmful health effects. These adverse effects may appear very suddenly or take time to develop. Fortunately, relatively few plants cause dangerous disorders when ingested. Nevertheless, precautions must be taken to avoid poisonings, particularly among young children. A mission was therefore carried out in Mauritius to identify these toxic plants. Sixty-nine species were inventoried as potentially toxic, ranging from Thevetia peruviana (Apocynaceae), considered extremely toxic, to Dieffenbachia seguine (Araceae), considered moderately to weakly toxic. Notably, two indigenous plants endemic to the region are also considered to have toxic properties: Cnestis glabra (Connaraceae) and Agauria salicifolia (Ericaceae).

The results of this mission illustrate the different degrees of toxicity, the chemical constituents, and their effects.

The potential long-term benefits of these toxins should not be underestimated, as shown by the example of Taxus brevifolia, which gave rise, among other things, to the famous Taxol.

Another aspect worth taking into account is that climate and environmental factors have a direct influence on the phytochemistry of the local floral biodiversity.

 

3. Structural Map of the Indian Ocean
J. Segoufin, Institut de Physique du Globe de Paris, France

Within the framework of the activities of the CCGM (Commission for the Geological Map of the World), under the supervision of UNESCO, it was decided to create a number of geological, tectonic, and structural maps encompassing the marine domain, for which a great deal of information is now available; the Commission for the Mapping of the Sea Floor is in charge of this domain.

Thus, two years ago, it was decided to publish a structural map of the Indian Ocean.

This map aims to synthesize current knowledge of the ocean and to show its formation and evolution from geophysical data collected by various institutes. It has an educational purpose of disseminating knowledge and is intended for distribution to secondary schools, colleges, and universities.

After several trials, the geographic limits of the map were set at 0° to 155° E and 71° S to 30° N. The map will be published in the Mercator projection at a scale of 1:10,000,000.

The structural map of the Indian Ocean consists of four sheets:

  Sheet 1: 0° to 80° E, 30° S to 30° N
  Sheet 2: 80° E to 155° E, 30° S to 30° N
  Sheet 3: 0° to 80° E, 71° S to 30° S
  Sheet 4: 80° E to 155° E, 71° S to 30° S

All of the data currently accessible will appear on this map:
bathymetric contours; magnetic anomalies; the age of the oceanic crust, derived from the magnetic anomalies; earthquake epicentres, divided into two classes (magnitudes above 6 and magnitudes below 6); transform faults and fracture zones; sediment thickness; subduction zones; ridge axes; active volcanoes; astroblemes; DSDP and ODP drill sites that reached the oceanic crust; seamounts and submarine plateaus; etc.
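One small preprocessing step named here, splitting earthquake epicentres into two magnitude classes, can be sketched as follows; the sample events are invented for illustration.

```python
# Divide epicentres into the map's two layers: magnitude above 6
# ("major") and magnitude below 6 ("minor"). Each layer keeps only
# the (lon, lat) positions needed for plotting.
events = [
    {"lon": 61.0, "lat": -10.5, "mag": 6.8},  # illustrative records,
    {"lon": 90.2, "lat": 5.1,   "mag": 4.9},  # not real catalogue data
    {"lon": 67.4, "lat": -24.0, "mag": 6.1},
    {"lon": 95.0, "lat": 3.3,   "mag": 5.5},
]

layers = {"major": [], "minor": []}
for ev in events:
    key = "major" if ev["mag"] > 6.0 else "minor"
    layers[key].append((ev["lon"], ev["lat"]))
```

Each of the other data types listed above becomes a similar layer, which is what allows them to be superimposed or removed on demand.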

To complement this map, it was also decided to produce a physiographic sheet, computed from the grid of Sandwell et al. and covering the entire Indian Ocean on a single sheet.

Most of the difficulties encountered in creating the structural map lie in making data from diverse origins consistent in a single format, which most of the time requires prior processing.

At present, sheets 1, 2, and 3 are finished; sheet 4 is in progress.

A presentation of the complete map is planned for the EUG meeting in Nice in April 2003.

The aim of this work is to disseminate the current state of knowledge of the Indian Ocean through this series of maps, and also through an interactive digital medium (CD-ROM) on which the various kinds of information will appear as superimposed layers that can be added or removed on demand.

Completion is scheduled for 2004, when the paper and digital versions of the Structural Map of the Indian Ocean will be presented at the IGC meeting in Florence.

 

4. Biological Collections, Specimens and Observations Information Gateway (ICSOB)
Guy Baillargeon, Agriculture and Agri-Food Canada

The ICSOB Gateway is a prototype search and mapping engine specialized in observation data and biological specimens from natural history collections. ICSOB indexes data available through Internet-accessible biodiversity networks that support distributed queries, such as The Species Analyst (TSA), the World Biodiversity Information Network (REMIB), or the European Natural History Specimen Information Network (ENHSIN). In a manner analogous to search engines (such as Google or AltaVista) that help locate hypertext documents, ICSOB harvests names from collections distributed across the Internet and connects users directly to the original data sources. Data records pass directly from the authorized custodians of the primary data to end users in real time. In addition, records with geographic coordinates (longitude, latitude) are plotted dynamically on a world map on which each distribution point is linked directly to the original data. The ICSOB Gateway provides a single point of access to millions of individual records from several distinct biodiversity networks. ICSOB is fully integrated with the multilingual version of the Integrated Taxonomic Information System (ITIS), facilitating access to the data through common names, scientific names, or synonyms.
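The fan-out/merge pattern described above, querying every network, merging the hits with provenance, and keeping only georeferenced records for the map, can be sketched as follows. The two in-memory "networks" and their records are stand-ins for real distributed endpoints, not ICSOB's actual interface.

```python
# Two mock biodiversity networks; real ones would be remote services
# answering distributed queries (e.g. TSA, REMIB, ENHSIN).
network_a = [
    {"name": "Acer saccharum", "lat": 45.5, "lon": -73.6, "source": "A"},
]
network_b = [
    {"name": "Acer saccharum", "lat": 46.8, "lon": -71.2, "source": "B"},
    {"name": "Picea glauca",   "lat": None, "lon": None,  "source": "B"},
]

def search(networks, name):
    """Query every network and merge the hits, keeping provenance."""
    hits = []
    for net in networks:
        hits.extend(rec for rec in net if rec["name"] == name)
    return hits

def mappable(records):
    """Keep only records with coordinates, as the gateway's map does."""
    return [r for r in records if r["lat"] is not None and r["lon"] is not None]

hits = search([network_a, network_b], "Acer saccharum")
points = mappable(hits)
```

The key design point is that the gateway holds no copies: each merged hit retains its source, so every plotted point can link back to the original data custodian.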

 


Track III-C-3:
Earth and Environmental Data


Chair: Liu Chuang, Chinese Academy of Sciences, Beijing, China

1. Interactive Information System for Irrigation Management
Md Shahriar Pervez, International Water Management Institute, Sri Lanka
Mohammad Ahmadul Hoque, Surface Water Modelling Centre, Bangladesh

Irrigation management is key to efficient and timely water distribution in canal command areas, keeping crop factors in view, and it requires adequate, continuously updated information about the irrigation system. This paper describes a GIS tool for irrigation management that provides information interactively for the decision-making process. This Interactive Information System (IIS) has been developed to facilitate the operation and management of command area development and to calculate irrigation efficiency at the field level. At the base of this development are geographic information systems (GIS), but the system is gradually being adapted to the kinds of decision and management functions that lie at the heart of the planning process of any irrigation project. It also helps design engineers assess the impact of the system's design parameters. The tool is ArcView-based, developed in Avenue code by integrating GIS with a relational database management system (RDBMS). Effective integration of GIS with the RDBMS enhances performance evaluation and diagnostic analysis capabilities. The application requires real-time topographic data, stored as spatially distributed datasets; a back-end RDBMS stores the related attribute information. It lets an irrigation manager perform real-time calculation and analysis covering:

a) Drawing of the detailed canal and drainage system, by category, along with other spatial layers
b) Cross-section profile of the canal
c) Comparison of cross sections
d) Long profile of the canal
e) Cut-and-fill calculation for a cross section with respect to the designed cross section at that location
f) Conveyance calculation for a particular section
g) Area-elevation curve for the command area or any drawn area
h) Areas affected by the failure of any irrigation structure
i) Retrieval of current irrigation structure information along with an image
j) Calculation of the system's efficiency

An easy updating system for the associated database keeps the system current with the real field situation. A user-friendly graphical user interface at the front end helps the manager operate the application easily. Using the point-and-click functions of this application, an irrigation manager can generate outputs as maps, tables, and graphs that guide prompt and appropriate decisions within a few minutes.
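One of the calculations listed above, cut and fill of a surveyed cross section against the designed one, can be sketched with matched elevation samples along the section. The station spacing and the two profiles are illustrative assumptions, not data from the paper.

```python
# Compare a surveyed canal cross-section with the design profile.
# Where the ground is above design grade, material must be cut;
# where it is below, material must be filled.
spacing = 1.0  # metres between survey stations (assumed)
designed = [102.0, 100.5, 99.0, 100.5, 102.0]   # design bed elevations, m
surveyed = [102.3, 100.2, 99.4, 100.6, 101.8]   # measured elevations, m

cut = fill = 0.0
for d, s in zip(designed, surveyed):
    diff = s - d                 # positive: material above design grade
    if diff > 0:
        cut += diff * spacing    # cross-sectional area to excavate
    else:
        fill += -diff * spacing  # cross-sectional area to fill
```

Multiplying these per-section areas by the reach length between cross sections gives the earthwork volumes a design engineer would report.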

 

2. Results of a Workshop on Scientific Data for Decision Making Toward Sustainable Development: Senegal River Basin Case Study
Paul F. Uhlir, U.S. National Committee for CODATA, National Research Council, USA
Abdoulaye Gaye, Senegalese National Committee for CODATA, Senegal
Julie Esanu, U.S. National Committee for CODATA, National Research Council, USA

Scientific databases relating to the environment, natural resources, and public health on the African continent are, for various reasons, difficult to create and manage effectively. Yet the creation of these and other types of databases, and their subsequent use to produce new information and knowledge for decision-makers, is essential to advancing scientific and technical progress in the region and to its sustainable development. The U.S. National Committee for CODATA collaborated with the Senegalese National CODATA Committee to convene a "Workshop on Scientific Data for Decision-Making Toward Sustainable Development: Senegal River Basin Case Study," held on 11-15 March 2002 in Dakar, Senegal. The workshop examined multidisciplinary data sources and data handling in the West Africa region, using the Senegal River Basin as a case study, to determine how these data are or can be better used in decision-making related to sustainable development. This presentation provides an overview of the workshop results and a summary of the published report.



3. Study on Spatial Databases of Chinese Ecosystems
Yue Yan-zhen, Chinese Academy of Sciences, China

The construction of spatial databases for Chinese ecosystems is based on the Chinese Ecosystem Research Network (CERN) of the Chinese Academy of Sciences (CAS). To meet the challenges of understanding and solving resource and environmental issues at regional and larger scales, and with the support of the Chinese Academy of Sciences, construction of CERN began in 1988. CERN consists of 35 ecological stations covering agriculture, forest, grassland, lake, and bay ecosystems, which produce a large volume of monitoring and measurement data every day. The quality of these data is controlled by five CERN sub-centers: water, soil, atmosphere, biological, and aquatic. Finally, all of these calibrated data, including spatial data, are collected in the synthesis center.

We constructed the spatial databases to connect the extensive monitoring data with ecological spatial information. This study of the spatial databases covers:
1. Standards for spatial data classification
2. Structure of the spatial databases
3. Functions of the spatial databases
4. Management of the spatial databases
5. Network data serving
6. Data-sharing policy

Key words: ecosystem network; Geographic Information System; data sharing

 

4. Development of the Global Map: National and Cross-National Coordination
Robert A. O'Neil, Natural Resources Canada, Ottawa, Canada

The Global Map is geospatial framework data of the Earth's land areas. This framework will be used to place environmental, economic and social data in its geographic context. The Global Map concept permits individual countries to determine how they will be represented in a global data base consisting of 8 layers of standardized data: administrative boundaries, drainage, transportation, population centres, elevation, land cover, land use and vegetation cover at a data density suitable for presentation at a scale of 1:1M. Usually it is the national mapping organizations that contribute data of their country to the Global Map, which is then made available at marginal or no cost.

At present, 94 nations have agreed to contribute information to the Global Map and an additional 42 are considering their participation. To date, coverage has been completed and is available for 11 countries.

While there is a wealth of source data available for this undertaking, not all nations have the capacity to evaluate the source data sets, make corrections and transform them into a contribution to the Global Map. A proposal to relax the specifications in order to hasten the completion of the Global Map will have to be balanced with the problems of dealing with heterogeneous databases, particularly in the integration, analysis and modeling.



Track III-D-4:
The Use of Artificial Intelligence and Telematics in Environmental and Earth Sciences

Chairs: Jacques-Octave Dubois, France and
Alexei Gvishiani, Russia

New tools such as artificial intelligence algorithms are needed to effectively manage and process the vast amounts of environmental and earth science data.

Given that databases are increasingly distributed, telematics techniques (combined computer and telecommunications techniques) are needed to deploy these algorithms. In other words, to process this considerable amount of information, clustering algorithms must be adapted for, and applied in, computer networks.
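To fix ideas, here is a minimal one-dimensional k-means clustering sketch (k = 2); the observations are invented, and the session's actual algorithms (geometrical clustering, fuzzy logic) are more elaborate.

```python
# Lloyd-style k-means on 1-D data: alternately assign each point to
# the nearest centre, then move each centre to the mean of its group.
data = [1.0, 1.2, 0.8, 5.0, 5.3, 4.9]  # illustrative observations
centres = [0.0, 6.0]                   # initial guesses

for _ in range(10):
    groups = [[], []]
    for x in data:
        nearer = 0 if abs(x - centres[0]) <= abs(x - centres[1]) else 1
        groups[nearer].append(x)
    # Update step; keep the old centre if a group ends up empty.
    centres = [sum(g) / len(g) if g else c for g, c in zip(groups, centres)]
```

In a networked setting the assignment step can run where the data live, with only group sums and counts shipped back for the update, which is the adaptation the session description refers to.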

The two books (published by CODATA Editions and Springer) by the two co-chairs will be showcased at this session.

1. Application of Artificial Intelligence and Telematics in the Earth and Environmental Sciences
Jacques-Octave Dubois, France
Alexei Gvishiani, Russia

Presentation of the book Artificial Intelligence and Dynamic Systems in Geophysical Applications, by A. Gvishiani and J.O. Dubois, Schmidt United Institute of Physics of the Earth RAS, CGDS, and Institut de Physique du Globe de Paris.

This volume is the second of a two-volume series written by A. Gvishiani and J.O. Dubois.
The series presents the application of new artificial intelligence and dynamic systems techniques to geophysical data acquisition, management, and studies. Most of the mathematical models, algorithms, and tools presented were developed by the authors. The first volume of the series, published in 1998, is entitled "Dynamical Systems and Dynamic Classification Problems in Geophysical Applications." It is devoted to the application of dynamic systems, pattern recognition, and finite vector classification with learning to a variety of geophysical problems.

The book "Artificial Intelligence" introduces geometrical clustering and fuzzy logic approaches to geophysical data analysis. A significant part of the volume is devoted to applying the artificial intelligence techniques introduced in the two volumes to fields such as seismology, geodynamics, geoelectricity, geomagnetism, aeromagnetics, topography, and bathymetry.

As in the first volume, this volume consists of two parts, describing complementary approaches to the analysis of natural systems. The first part, written by A. Gvishiani, deals with new ideas and methods in geometrical clustering and the fuzzy logic approach to geophysical data classification. It lays out the mathematical theory and formalized algorithms that form the basis for classification and clustering of the vector objects under consideration. It lays the foundation for the second part of this book which is the use of this classification in the study of dynamical systems.

The second part, written by J.O. Dubois, is concerned with various theoretical tools and their applications to modeling of natural systems using large geophysical data sets. Fractals and dynamic systems are used to analyse geomorphological (continental and marine), hydrological, bathymetrical, gravimetrical, seismological, geomagnetical and volcanological data.
In these applications chaos theory and the concept of self-organized criticality are used to describe the evolution of dynamic systems.

The first volume is devoted to the mathematical and algorithmic basis of the proposed artificial intelligence techniques; this volume presents a wide range of applications of those techniques to geophysical data processing and research problems. At the same time, it presents the reader with another algorithmic approach based on fuzzy logic and geometrical illumination models.

Many readers will be interested in the two volumes (vol.1, J.O. Dubois, A. Gvishiani "Dynamic Systems and Dynamic Classification Problems in Geophysical Applications" and the present vol.2, A. Gvishiani, J.O. Dubois "Artificial Intelligence and Dynamic Systems in Geophysical Applications") as a package.

 

2. The Environmental Scenario Generator (ESG): A Distributed Environmental Data Mining Tool
Eric A. Kihn, NOAA/NGDC, Boulder, CO, USA
Dr. Mikhail Zhizhin, RAS/CGDS, Moscow, Russia

The Environmental Scenario Generator (ESG) is a network-distributed software system designed to allow a user running a simulation to intelligently access distributed environmental data archives for inclusion and integration with model runs. The ESG is built to solve several key problems for the modeler. The first is to provide access to an intelligent "data mining" tool so that key environmental data can not only be retrieved and visualized but, in addition, user-defined conditions can be searched for and discovered. As an example, a user modeling a hurricane's landfall might want to model the result of an extreme rain event prior to the hurricane's arrival. Without a tool such as ESG, the simulation coordinator would be required to know:

  • For my region, what constitutes an extreme rain?
  • How can I find an example in the real data of when such an event occurred?
  • What about temporal or spatial variations of my scenario, such as finding the wettest week, month, or year?

If we consider combining these questions across multiple parameters, such as temperature, pressure, wind speed, etc. and then add multiple regions and seasons the problem reveals itself to be quite daunting.

The second hurdle facing a modeler who wants to include real environmental effects in a simulation is how to manage many discrete data sources. Simulation runs often face tight deadlines and lack the manpower needed to retrieve data from across the network, reformat it for ingest, regrid or resample it to fit the simulation parameters, and then incorporate it in model runs. Even if this could be accomplished, what confidence can the modeler have in the different data sources and their applicability to the current simulation without becoming expert in each data type? The unfortunate side effect is that the environment is often forgotten in simulations, or a single environmental database is created and "canned" to be replayed again and again.

The ESG solves these problems for the modeler by providing a 100% Java, platform-independent client with access to both data mining and database creation capabilities on a network-distributed parallel computer cluster, with the ability to perform fuzzy-logic-based searching on a global array of environmental parameters. By providing intelligent, instantaneous access to real data, it ensures that the modeler can include realistic, reliable and detailed environments in simulation applications.

This demonstration will present the results of data-mining, visualization, and a domain integration tool developed in a network distributed fashion and applied to environmental modeling.

 

3. Satellite Imagery As a Multi-Disciplinary Tool for Environmental Applications
Herbert W. Kroehl, World Data Center for Solar-Terrestrial Physics, National Geophysical Data Center, USA
Eric A. Kihn, NOAA/NGDC, USA
Alexei Gvishiani, RAS/CGDS, Russia
Mikhail N. Zhizhin, RAS/CGDS, Russia

Satellite technologies offer a unique opportunity to monitor the earth and its environment. Environmental satellites, which initially focused on “in situ” measurements of the ambient environment, now take advantage of remote sensing technology through the use of imagers and sounders. Visible, infrared, microwave and ultraviolet emissions are now recorded across a swath as large as 3,000 km by instruments on operational meteorological and earth observing satellites. The resulting radiances are used to compute a disparate set of parameters serving very different scientific disciplines, e.g. space physics and sociology.

What environmental parameters are routinely computed from imagery and soundings recorded on satellites? The imagery from operational weather satellites is used to monitor clouds, snow, ice and solar activity and to construct profiles of atmospheric temperature, humidity and ozone. The same images have proved useful in assessing the state of the environment, detecting wildfires, tracking the flow of ash from volcanoes, and assessing population dynamics. In addition to improving on the operational instruments, imagers on earth observing systems are used to assess environmental health, classify vegetation, assess the effects of natural hazards, and build digital elevation models.

But when the same data are used for many different applications, one scientist’s signal becomes another scientist’s noise, and it becomes important to classify the different environmental signals contained in an image. In addition, data mining techniques require automatic classification of images, especially when the image volumes are so large.

A sample of the diverse use of images recorded on weather and earth observing satellites will be presented as a prelude to the need for mathematical techniques to classify information contained in the images.



4. Development of the Space Physics Interactive Data Resource- II (SPIDR II) Experiences Working in a Virtual Laboratory Environment
Eric A. Kihn, NOAA/NGDC, USA
Dr. Mikhail Zhizhin, RAS/CGDS, Russia
Prof. Alexei Gvishiani, RAS/CGDS, Russia
Dr. Herbert W. Kroehl, NOAA/NGDC, USA

SPIDR-II is a distributed resource for accessing space physics data, designed and constructed jointly at NGDC and CGDS to support the requirements of the Global Observation and Information Network (GOIN) project. SPIDR is designed to allow users to search, browse, retrieve, and display Solar-Terrestrial Physics (STP) and DMSP satellite digital data. SPIDR consists of a WWW interface, online data and information, interactive display programs, and advanced data mining and data retrieval programs.

The SPIDR system currently handles the following: DMSP visible, infrared and microwave browse imagery; ionospheric parameters; geomagnetic 1.0-minute and hourly value data; geophysical and solar indices; GOES x-ray, plasma, and magnetometer data; and cosmic ray, solar radio telescope, satellite anomaly and city lights data sets. The goal is to manage and distribute all STP digital holdings through the SPIDR system, providing comprehensive and authoritative online data services, analysis and numerical modeling to the space physics community.

The successful cooperation between NGDC and CGDS has produced a SPIDR-I mirror in 1997; the development and launch of SPIDR-II servers in Boulder, Moscow, and Sydney in 1999; additional SPIDR-II mirrors in South Africa and Japan in 2000; and a prototype of a new satellite data system in 2001.

This presentation will detail the technologies and methodologies that produced exceptional results from a geographically distributed team working in a virtual laboratory environment.

 

5. An Automatic Analysis of Long Geoelectromagnetic Time Series: Determination of the Volcanic Activity Precursors
J. Zlotnicki, Observatoire de Physique du Globe de Clermont-Ferrand, France
J.-L. Le Mouël, Director of the Department of Geomagnetism, Institut de Physique du Globe de Paris, France
S. Agayan, Center of Geophysical Data Studies and Telematics Applications IPE RAS, Russia
Sh. Bogoutdinov, Center of Geophysical Data Studies and Telematics Applications IPE RAS, Russia
A. Gvishiani, Director of the Center of Geophysical Data Studies and Telematics Applications, IPE RAS, Russia
V. Mikhailov, Institute for the Physics of the Earth RAS, Russia
S. Tikhotsky, Institute for the Physics of the Earth RAS, Russia

New methods have been developed for the analysis of long geophysical time series, based on a fuzzy logic approach. These methods include algorithms for the detection of anomalous signals. They are specially designed for, and very efficient in, problems where the definition of an anomalous signal is itself fuzzy, i.e. where the general signature, amplitude and frequency of the signal cannot be prescribed a priori, as when seeking precursors of natural disasters in geophysical records. The developed algorithms are able to determine the intervals of a record that are anomalous with respect to the background signal present in the record. Other algorithms deal with the morphological analysis of signals. These algorithms were applied to the analysis of electromagnetic records from La Fournaise volcano (Réunion Island). For several years, five stations measured the electric field along different directions. The signals specific to eruption events are determined and correlated across several stations. Other types of signals, corresponding to storms and other sources, are also determined and classified. Software has been designed that helps to analyze the spatial distribution of activity across the stations.
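The general idea of fuzzy anomaly detection, as opposed to a hard threshold, can be sketched as follows. This is an illustrative stand-in, not the authors' algorithms: each sample receives a graded membership value in [0, 1] for "anomalous", computed from its deviation from a local background estimate, so that intervals rather than isolated spikes can be picked out.

```python
import math
import random

# Synthetic record: Gaussian background with an injected anomalous burst.
random.seed(1)
signal = [random.gauss(0.0, 1.0) for _ in range(500)]
for i in range(240, 260):
    signal[i] += 6.0

def anomaly_membership(series, window=50, scale=3.0):
    """Fuzzy membership in 'anomalous' for each sample, in [0, 1].

    The background is estimated from the preceding `window` samples;
    the deviation (in background standard deviations) is mapped through
    a sigmoid centred at `scale`, giving a soft rather than hard cutoff.
    """
    scores = []
    for i, x in enumerate(series):
        background = series[max(0, i - window):i] or [x]
        mean = sum(background) / len(background)
        var = sum((b - mean) ** 2 for b in background) / len(background)
        std = math.sqrt(var) or 1.0
        z = abs(x - mean) / std
        scores.append(1.0 / (1.0 + math.exp(-(z - scale))))
    return scores

scores = anomaly_membership(signal)
flagged = [i for i, s in enumerate(scores) if s > 0.5]
```

A hard threshold would require prescribing the signal's amplitude in advance; the membership score instead ranks every interval by how anomalous it is relative to its own local background, which matches the fuzzy setting described in the abstract.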


6. Application of telematics approaches for solving the problems of distributed environmental monitoring
M. Zgurovsky, A. Novikov, National Technical University of Ukraine, Kiev Polytechnic Institute

The results of research carried out at the Glushkov Cybernetics Center of the National Academy of Sciences of Ukraine are presented, together with a review of advanced developments in the field of distributed environmental monitoring.

Among the developments presented is an interactive system for modeling and forecasting ecological, economic and other processes on the basis of observations, in support of rapid control decisions. The system is based on the inductive group method of data handling (GMDH), used for automatic extraction of substantive information from measurement data. The efficiency of the system is demonstrated in applications modeling and forecasting the dynamics of zooplankton concentration, the number of microorganisms in contaminated soil, and others.
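The "inductive method of arguments group accounting" is the literal translation of what is known in English as the Group Method of Data Handling (GMDH). A minimal one-layer sketch of the idea follows; it uses linear pair models rather than GMDH's usual quadratic partial descriptions, and entirely synthetic data, but it shows the self-selection step: fit a model for every pair of inputs on a training split and keep the pair that generalizes best on a validation split.

```python
import random

def solve(a, b):
    """Solve the linear system a x = b by Gaussian elimination with pivoting."""
    n = len(b)
    m = [row[:] + [b[i]] for i, row in enumerate(a)]
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(m[r][col]))
        m[col], m[pivot] = m[pivot], m[col]
        for r in range(col + 1, n):
            f = m[r][col] / m[col][col]
            for c in range(col, n + 1):
                m[r][c] -= f * m[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (m[r][n] - sum(m[r][c] * x[c] for c in range(r + 1, n))) / m[r][r]
    return x

def fit_pair(xi, xj, y):
    """Least-squares fit of y ~ a + b*xi + c*xj via the normal equations."""
    rows = [[1.0, u, v] for u, v in zip(xi, xj)]
    ata = [[sum(r[p] * r[q] for r in rows) for q in range(3)] for p in range(3)]
    atb = [sum(r[p] * t for r, t in zip(rows, y)) for p in range(3)]
    return solve(ata, atb)

def gmdh_layer(X, y, split=0.7):
    """Return ((i, j), coeffs) of the pair model with the best validation error."""
    n_train = int(len(y) * split)
    best = None
    for i in range(len(X)):
        for j in range(i + 1, len(X)):
            a, b, c = fit_pair(X[i][:n_train], X[j][:n_train], y[:n_train])
            err = sum((a + b * u + c * v - t) ** 2
                      for u, v, t in zip(X[i][n_train:], X[j][n_train:],
                                         y[n_train:]))
            if best is None or err < best[0]:
                best = (err, (i, j), (a, b, c))
    return best[1], best[2]

# Synthetic data: y depends on inputs 0 and 2 only (plus small noise).
random.seed(2)
X = [[random.uniform(-1, 1) for _ in range(60)] for _ in range(4)]
y = [2.0 * X[0][k] - 1.0 * X[2][k] + random.gauss(0, 0.01) for k in range(60)]
pair, coeffs = gmdh_layer(X, y)
```

The validation split does the "automatic extraction": the pair carrying the substantive information wins on held-out data, without the modeler specifying which inputs matter.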

The designs of a mobile laboratory for rapid radiation monitoring (RAMON) and of an automated system for the study of subsoil water processes (NADRA) are presented. Problems of user-interface intellectualization in geophysical software are also considered.



Track IV-B-5:
Seismic Data Issues


Chair: A. Gvishiani, Director of the Center of Geophysical Data Studies and Telematics Applications IPE RAS, Russia

1. Clustering of Geophysical Data by New Fuzzy Logic Based Algorithms
S. Agayan, Center of Geophysical Data Studies and Telematics Applications IPE RAS, Russia
Sh. Bogoutdinov, Center of Geophysical Data Studies and Telematics Applications IPE RAS, Russia
A. Gvishiani, Director of the Center of Geophysical Data Studies and Telematics Applications IPE RAS, Russia
M. Diament, Institut de Physique du Globe de Paris (IPGP), France
V. Mikhailov, Institute for the Physics of the Earth RAS, Russia
C. Widiwijayanti, Institut de Physique du Globe de Paris (IPGP), France

A new system of clustering algorithms, based on a geometrical model of illumination in finite-dimensional space, has been developed recently using a fuzzy-sets approach. The two major components of the system are the RODIN and CRYSTAL algorithms. These two efficient clustering tools will be presented along with their applications to seismological, gravity and geomagnetic data analysis. The regions of the Molucca Sea (Indonesia) and the Gulf of Saint-Malo (France) are under consideration. In the course of studying the very complicated geodynamics of the Molucca Sea region, earthquake hypocenters were clustered with respect to their position, type of faulting and horizontal displacement strike. The results of this procedure clarified the stress pattern and hence the geodynamical structure of the region. The RODIN algorithm was also applied to cluster the results of a pseudo-inversion of the anomalous gravity field over this region. It improved the solution considerably and helped to determine the depths and horizontal positions of the sources of the gravity anomalies. The results obtained correlate well with those of local seismic tomography and gravity inversion. In the region of the Gulf of Saint-Malo, the developed algorithms were successfully used to investigate the structure of quasi-linear magnetic anomalies onshore and offshore.
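RODIN and CRYSTAL are not described here in enough detail to reproduce. As a generic stand-in, the sketch below clusters synthetic two-dimensional "epicentre" coordinates with classical fuzzy c-means, which illustrates the fuzzy-sets idea involved: each point receives a graded membership in every cluster rather than a hard assignment. All data and parameters are hypothetical.

```python
import random

def fuzzy_c_means(points, centers, m=2.0, iters=30):
    """Iterate fuzzy c-means updates from the given initial centers.

    Returns (centers, u) where u[j][k] is the membership of point k in
    cluster j; memberships for each point sum to 1.
    """
    c = len(centers)
    u = [[0.0] * len(points) for _ in range(c)]
    for _ in range(iters):
        for k, p in enumerate(points):          # membership update
            d = [max(1e-12, (p[0] - centers[j][0]) ** 2
                          + (p[1] - centers[j][1]) ** 2) for j in range(c)]
            for j in range(c):
                u[j][k] = 1.0 / sum((d[j] / d[l]) ** (1.0 / (m - 1))
                                    for l in range(c))
        for j in range(c):                      # weighted-centroid update
            w = [u[j][k] ** m for k in range(len(points))]
            s = sum(w)
            centers[j] = (sum(wk * p[0] for wk, p in zip(w, points)) / s,
                          sum(wk * p[1] for wk, p in zip(w, points)) / s)
    return centers, u

# Two synthetic clusters of epicentre coordinates, seeded for repeatability.
random.seed(3)
pts = ([(random.gauss(0.0, 0.3), random.gauss(0.0, 0.3)) for _ in range(30)] +
       [(random.gauss(5.0, 0.3), random.gauss(5.0, 0.3)) for _ in range(30)])
centers, u = fuzzy_c_means(pts, [pts[0], pts[-1]])
```

The graded memberships are what make such clustering useful for mixed geophysical populations: a hypocenter lying between two fault systems is not forced into either cluster but carries a membership in both.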

 

2. Artificial Intelligence Methods in the Analysis of Large Geophysical Data Bases
A. Gvishiani, Director of the Center of Geophysical Data Studies and Telematics Applications IPE RAS, Russia
J. Bonnin, Institut de Physique du Globe de Strasbourg, France

The presentation is devoted to different kinds of artificial intelligence algorithms oriented towards geophysical applications: syntactic pattern recognition, geometrical cluster analysis, time series processing and classification, dynamic pattern recognition with learning, and others. A large part of the presentation is devoted to applications of fuzzy logic and fuzzy mathematics in the development of artificial intelligence algorithms. The following geophysical and environmental applications will be presented: recognition of strong-earthquake-prone areas in the Alps, Pyrenees and Caucasus; syntactic classification of seismograms and strong ground motion records; identification of anomalies in geoelectrical and gravity data; and the use of clustering for the interpretation of geomagnetic data.

 

3. Geo-Environmental Assessment of Flash Flood Hazard of the Safaga Terrain, Egypt, Using Remote Sensing Imagery
Maged L. El Rakaiby, Nuclear Materials Authority, Egypt
Mohamed N. Hegazy, National Authority for Remote Sensing and Space Sciences, Egypt
Menas Kafatos, Center for Earth Observing and Space Research, GMU, USA

We emphasize the use of space images for detecting, interpreting and mapping elements of the geological and geomorphologic environment of the Safaga terrain, Egypt, in order to monitor the geomorphologic elements causing flash floods. Safaga town and the associated highways are severely affected by flash floods more than once every year. Information interpreted from space images is very useful for reducing the flash flood hazard and adjusting the use of the Safaga terrain.


4. On the Modeling of Fast Variations of the Mode of Deformation of Lithospheric Plates
M. Diament, Institut de Physique du Globe de Paris (IPGP), France
J.-O. Dubois, Institut de Physique du Globe (IPGP), France
E. Kedrov, Center of Geophysical Data Studies and Telematics Applications IPE RAS, Russia
M. Kovalenko, The State Research Institute of Aviation Systems, Russia
V. Mikhailov, Institute for the Physics of the Earth RAS, Russia
Yu. Murakami, Geological Survey of Japan, Japan

This paper discusses possible applications of new, recently obtained exact solutions of elasticity theory problems for domains having corner points. Analysis of the solutions obtained demonstrates that the mode of deformation in narrow zones along the boundary of such bodies, close to the corner points, depends strongly on the work of the surface forces released at these points.

Exact solutions for a rectangle differ fundamentally from the classical exact solutions for unbounded domains (e.g. a wedge, an infinite strip, etc.) or for domains limited by a smooth boundary. The explanation lies in the fact that the properties of corner points differ considerably from the properties of the domain they belong to. In particular, such a fundamental notion as the surface area element cannot be introduced at a corner point; thus the effect of this point can be calculated only as an additional work released at the corner point by some fictitious forces and/or torques, additional to the acting surface forces.

When some interval of a body's boundary has high curvature or contains a corner point, and the boundary loading does not vanish there, then small variations of the shape of the boundary in the vicinity of such an interval or corner point can cause finite or even infinite variations of the specific energy. This actually means that the Saint-Venant principle is not valid for areas containing corner points. When the boundary of an area is strongly irregular, the solution depends on how the boundary loading accommodates to the intervals of high boundary curvature.
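As illustrative background (not taken from the abstract itself): the classical Williams analysis of an elastic wedge with traction-free faces shows the singular behaviour at corner points referred to above. Near a corner of opening angle $2\alpha$ the stresses behave as

\[
\sigma_{ij}(r,\theta)\;\sim\;K\,r^{\lambda-1}f_{ij}(\theta),
\qquad
\sin(2\lambda\alpha)=\pm\,\lambda\,\sin(2\alpha),
\]

where $r$ is the distance from the corner and $\lambda$ is a root of the eigenvalue equation; any root with $0<\lambda<1$ gives stresses that grow without bound as $r\to 0$. For a crack ($2\alpha=2\pi$) the smallest root is $\lambda=\tfrac12$, the familiar inverse-square-root stress singularity, which is why energy arguments near such points require special care.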

The results obtained make it possible to consider corner points of lithospheric plates as singular or "trigger" points, probably responsible for the fast observable changes in the mode of deformation along plate boundaries. These fast changes at the plate boundaries could arise not only from variations of the boundary forces in the vicinity of corner points but also from changes of the inner structure and/or rheology inside the plates. The latter changes could arise either from decompaction of rocks in the vicinity of corner points (as a consequence of earthquakes, or of tectonic or thermal processes) or, vice versa, from rock compaction taking place during periods of seismic quietness.

This investigation has been performed by the staff of a virtual laboratory on new solutions of elasticity theory, designed and maintained by scientists from Russia, Japan, France and the USA in the framework of a joint project supported by the International Science and Technology Center. The website designed for the project supports teleconferences and the exchange and presentation of results.

 

5. New Mathematical Approach to Seismotectonic Data Studies
M. Kovalenko and N. Tsybin, State Research Institute of Aviation Systems
Yu. Rebetsky, Institute of Physics of the Earth, Russia
Yu. Murakami, Geological Survey of Japan, Japan

The paper discusses possible applications of new, recently obtained exact solutions of some classical problems of elasticity theory for domains having ruptures. Analysis of the solutions obtained demonstrates that the solution for domains with ruptures is non-unique. The explanation lies in the fact that the properties of crack apexes differ considerably from the properties of the domain they belong to. The stress distribution depends strongly on the work of the surface forces released at these points; practically, it is a question of the work released at the micro level. Thus the effect of crack apexes can be calculated only as an additional work released there.

The results obtained make it possible to consider the apexes of faults in lithospheric plates as trigger points, probably responsible for the fast observable changes in the mode of deformation. These fast changes could arise not only from variations of the boundary forces in the vicinity of fault apexes but also from changes of the inner structure and/or rheology inside the plates. Actually, this means that the crack energy may change without any increase or decrease in the length of the crack.

This study has been performed using a virtual laboratory approach, designed and maintained by scientists from Russia, Japan, France and the USA in the framework of a joint project supported by the International Science and Technology Center. The website designed for the project supports teleconferences and the exchange and presentation of results.

 

Last site update: 15 March 2003