Implications of the OpenGIS® Specification for Regional Science

An Open GIS Consortium (OGC) White Paper

 

by Lance McKee
Vice President, Corporate Communications
Open GIS Consortium, Inc.
lmckee@opengis.org

 

Abstract: E-commerce, the Web, the OpenGIS Specification, and new tools for data production (in particular, high-resolution commercial earth imaging systems supported by digital orthophotogrammetry, and GPS) will soon make commercial and noncommercial geodata and geoprocessing capabilities of all kinds far more abundant, discoverable, accessible, affordable, and widely used than today. This is of interest to the interdisciplinary field of regional and urban studies because the human activities of interest to regional planners have a strong spatial component and because geoprocessing plays a fundamental role in the disciplines that seek to understand and affect those activities. This article provides an overview of topics that offer a number of opportunities for further research.

  1. Introduction – Interoperability Via Common Software Interfaces

There are three basic sources of non-interoperability in the world of geoprocessing:

  1. GIS, remote sensing, and AM/FM are specialized technologies that use quite different approaches to representing geographic information in software.
  2. Likewise, within any of these technologies, vendors who have packaged the technology in software products have taken different approaches, with different data structures and software architectures.
  3. And finally, using any of these technologies, different practitioners use different naming conventions unless they rigorously agree on them in advance, which makes it difficult to share data and to interpret others’ data.

The first two kinds of non-interoperability will soon be managed quite well through software interfaces on commercial geoprocessing products. The third can be managed through a combination of software and data coordination.

It is impossible to standardize the ways in which geographic data is created and manipulated by software, because there are so many diverse application requirements and legacy approaches. Conversion programs and data transfer standards can be helpful, but they still require time-consuming manipulation of whole data files, and often data is lost or errors are introduced. These previously unavoidable approaches will continue to make sense for some archiving applications, or for preparing large quantities of data created in one system for frequent use in another system.

With its OpenGIS® Specification, the Open GIS Consortium, Inc. (OGC), a not-for-profit geoprocessing industry consortium, has undertaken to overcome the first two kinds of non-interoperability by specifying open interfaces that together constitute a single geoprocessing "lingua franca." The open interface approach depends on the software vendors agreeing (through a collaborative object modeling process) on such things as parameter lists, common command syntax, and the primitive data types that make up the interface, and then agreeing to build these interfaces onto their software.
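To make the idea concrete, the following sketch (in Java) shows roughly what such an agreed-upon interface might look like. The names and signatures here (FeatureSource, BoundingBox, Feature, query) are illustrative inventions for this article, not definitions taken from the OpenGIS Specification; the point is only that every vendor implements the same contract over its own internal data structures.

import java.util.Iterator;

/** Agreed-upon primitive: a rectangular extent in a named coordinate reference system. */
final class BoundingBox {
    final double minX, minY, maxX, maxY;
    final String coordinateSystem;          // an agreed coordinate system identifier
    BoundingBox(double minX, double minY, double maxX, double maxY, String crs) {
        this.minX = minX; this.minY = minY; this.maxX = maxX; this.maxY = maxY;
        this.coordinateSystem = crs;
    }
}

/** Agreed-upon view of a geographic feature: geometry plus named attributes. */
interface Feature {
    String getId();
    String getGeometryAsText();             // geometry in an agreed text encoding
    Object getAttribute(String name);
}

/**
 * The contract that every participating vendor builds onto its product. A GIS, an
 * imaging system, and a spatial RDBMS can all sit behind it, each translating the
 * calls into its own native operations.
 */
interface FeatureSource {
    /** Return the features of one type whose geometries intersect the given extent. */
    Iterator<Feature> query(String featureType, BoundingBox extent);
}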

This approach enables System A to be a client and System B to be a server, so that a spatial query entered on System A can be interpreted and processed on System B. The approach provides access to both data and functionality held in heterogeneous systems. Interoperability will still sometimes be imperfect, because System B may not be able to service all the kinds of queries that System A is capable of making, and because certain kinds of information do not translate easily between, for example, an imaging system and a GIS. But the OGC Technical Committee, composed of users, integrators, academics, and vendors, is taking care to design the interfaces to provide as much interoperability as possible.

The interface approach has the great advantage of providing "transparent access" between systems. That is, it becomes possible for the data on another system to be as readily available to you as the data on your own system. The OpenGIS Specification for geoprocessing interfaces largely eliminates the need for data format standards and costly batch data conversion. A query returns not a whole data file but only the "result," the answer to the query, so the network model eliminates the need for users to keep (usually outdated) copies of whole data sets.

An even greater advantage is that the interface approach enables geoprocessing to become an integral part of the new Internet and Web-based distributed computing paradigm, in which applets, middleware, components, e-commerce tools, on-line data servers, and object request brokers give any networked computing device real-time access to a huge universe of data and processing resources. Any Internet-linked device, even a cell phone, will be able to access countless data servers and powerful application servers as if all those terabytes of geodata and sophisticated software were on its own local storage media. Remote servers will deliver small GIS applets, and geo-enabled software components will enable ordinary users to make use of smart digital maps in all kinds of desktop documents. Conventional RDBMSs (through advances not related to the OpenGIS Specification) will soon store complex spatial data and serve it (through OpenGIS conformant interfaces) to a wide variety of GIS and non-GIS applications.
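Building on the hypothetical types sketched above, the fragment below illustrates the client's side of such an exchange: System A asks a question and works only with the answer, while System B's full database stays where it is. The FeatureSource handed to the method would, in a real deployment, be a proxy supplied by whatever networking machinery carries the call; the parcel theme, attribute names, and coordinates are invented for illustration.

public class ClientExample {
    /**
     * System A's side of the exchange: pose a spatial query, use only the result.
     * The FeatureSource passed in would in practice be a proxy to System B.
     */
    public static void summarizeParcels(FeatureSource parcels) {
        // The area of interest, expressed in coordinates both systems understand.
        BoundingBox downtown = new BoundingBox(-71.07, 42.35, -71.05, 42.37, "WGS84");

        // Only the matching features cross the network; the full parcel database stays put.
        java.util.Iterator<Feature> result = parcels.query("parcel", downtown);
        int count = 0;
        while (result.hasNext()) {
            Feature f = result.next();
            System.out.println(f.getId() + ": " + f.getAttribute("owner"));
            count++;
        }
        System.out.println(count + " parcels intersect the area of interest.");
    }
}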

It is important to note that data content standards and metadata standards, like those being developed by the Federal Geographic Data Committee (FGDC) and by state and local data coordination groups, are distinct from vendor-specific internal data standards and are still an essential part of the interoperability solution. But even these "semantic" issues will be addressable, to an important degree, by software: In the OpenGIS Specification, the formal definition of a geographic feature includes metadata. The metadata is essential for "catalog" or "clearinghouse" schemes through which it is possible to discover the existence of data, for example, of a particular theme for a particular region, acquired during a particular time interval. It will be possible to write software that will provide "semantic translation," mapping between dissimilar metadata schemas to find a match that may be less than perfect but that may be acceptable.
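As a very rough illustration of what such "semantic translation" software might do, the sketch below uses a hand-built table to map one community's metadata field names onto another's, so that a catalog query can still find an acceptable, if imperfect, match. The class, the schemas, and the field names are all invented for this article; real catalog and clearinghouse software would use much richer models.

import java.util.HashMap;
import java.util.Map;

public class SemanticTranslator {
    // Local schema term -> term used by the remote Information Community.
    private final Map<String, String> fieldMap = new HashMap<>();

    public SemanticTranslator() {
        fieldMap.put("landUse", "land_use_class");     // same concept, different name
        fieldMap.put("acquiredDate", "capture_date");  // when the data was collected
        fieldMap.put("theme", "dataset_topic");        // thematic keyword
    }

    /** Translate a local metadata query into the remote schema's vocabulary. */
    public Map<String, String> translateQuery(Map<String, String> localQuery) {
        Map<String, String> remoteQuery = new HashMap<>();
        for (Map.Entry<String, String> term : localQuery.entrySet()) {
            String remoteField = fieldMap.get(term.getKey());
            if (remoteField != null) {
                remoteQuery.put(remoteField, term.getValue());
            }
            // Unmapped terms are dropped: the match may be imperfect but still acceptable.
        }
        return remoteQuery;
    }
}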

  2. Implications of Ubiquitous Access to Network-Resident Geodata and Geoprocessing Services

E-commerce, the Web, the OpenGIS Specification, RDBMSs capable of storing complex geographic data, and new tools for data production (in particular, high-resolution commercial earth imaging systems supported by digital orthophotogrammetry, and GPS) will soon make commercial and noncommercial geodata and geoprocessing capabilities of all kinds far more abundant, discoverable, accessible, affordable, and widely used than today. This is of interest to the interdisciplinary field of regional and urban studies because the human activities of interest to regional planners have a strong spatial component and because geoprocessing plays a fundamental role in the disciplines that seek to understand and affect those activities.

o        Implications for Research

GIS supports interdisciplinary approaches because the "data layers" used in most spatial analyses come from different sources and were developed for different purposes. The basic data layers, termed "Framework" by the FGDC, include soils, land use, land cover, transportation, elevation, etc., all of which are created by different disciplines. Our information technologies influence our way of seeing the world: GIS causes us to look for relationships between spatially distributed phenomena that we might, unaided, never consider to be related. The interdisciplinary field of regional and urban studies relies heavily on geoprocessing, and so it will benefit from improved access via the Web to thousands of geodata sources.

Cliff Kottman, OGC’s Chief Scientist, takes a long view of the implications of interoperable geoprocessing for researchers and practitioners:

 

"We are talking about the world of geospatial discourse, which becomes enabled by great geospatial literature. This discourse depends on both the technology and the artist, the gifted writer who creates compelling stories. Arguably, most of the great geospatial literature hasn't happened yet because we are just now inventing the book. The best of the geoprocessing software and applications to date are the early geoprocessing literature that will survive the transition to this emerging geospatial culture. Suppose it is the year 2048. Imagine entering into a geospatial virtual reality that enables you to experience the Mississippi River and its tributaries, with their component precipitation and runoff, leaching, pollution, treatment plant effluents, and flora and fauna. The geospatial "literature" will provide a new way of seeing the Mississippi, just like Don Quixote is a new way of understanding the Spanish middle class and the end of feudalism. The geospatial virtual reality experience will change the way you think and behave, just like reading Don Quixote changed the way people thought about being Spanish. Who will the authors be? Today's geospatial works compare to little books like "Dick and Jane." We have flight simulators and fly-through technology, but these are still thin, lightweight, and they don't change the way we live. In 1300, reading was marvelous and magical to illiterate European peasants, but it was an essential part of the life and work of the educated elite who governed and who carried forward the tenuous thread of classical culture. Now reading is part of the life and work of most people. Similarly, digital geospatial information technologies are not widely used now, but they will be, and their content will be greatly amplified by new geniuses, new receptive audiences/markets, and the growth of traditions, standard usage, and branching styles and orientations. The broader IT industry is almost certain to supply ever more powerful supporting capabilities -- CPU speeds, bandwidth, portability, embedability into information appliances, omnipresent Internet access -- which will meet the needs of the geoprocessing dreamers who will build the great virtual environments and applications."

from OGC’s written statement for the UCGIS Education Summit at GIS/LIS ‘98

The coming era of easy access to thousands of geodata sources will, of course, have its problems. Most notably, there will be serious issues involving semantics, quality, and lineage. More people will be using geoprocessing in their work, and so more people will encounter the frustration of data that cannot be trusted. A likely result will be that disciplines, professions, local governments, and industry associations will assign teams to take responsibility for data coordination activities. They will in most cases seek to harmonize with related "Information Communities" and with national and international geodata standards groups, such as the FGDC and ISO.

o        Implications for Regional and Urban Management

Open systems geoprocessing has important implications for a wide variety of government agencies and departments. A significant amount (perhaps more than 60%) of the information used within and shared between government agencies has a geographic component which qualifies it as "geodata".

Governments serve constituents and manage resources that are distributed spatially. Thus in every major metropolitan area there are hundreds of agencies and departments that could benefit from GISs that help collect, manage, display, and analyze spatial data. Typically, among the groups that do have such systems, the software systems and data semantics are incompatible. One result is that data is collected redundantly, and sharing data is difficult and expensive. Because developing new data and converting existing data is always the most expensive part of building a GIS capability, the problem of non-interoperability often makes it difficult to justify GIS projects in budgets. Many activities in the public and private sector that depend on spatial data – urban planning, street maintenance, management of public and private utilities’ underground and aboveground facilities, real-estate appraisal, public safety, environmental monitoring, traffic management, siting analysis, etc. – are much less efficient than they could be, and physical projects proceed with far less coordination than they would if basic data were readily available to anyone who needs it.

A less obvious result of the historical problem of GIS non-interoperability is that the market for GIS software and services has remained much smaller than it could be. If no significant change occurs in the awareness of local, state, and national data managers, OGC’s technologies will be delivered by the private sector in products which will be adopted in the same piecemeal way that PCs and GIS packages have been adopted in the past. The market for software, data, and services will grow faster than before, but more slowly than it might, and in a disjointed fashion. Alternatively, data coordination efforts might give rise to various public and private approaches to creating and maintaining distributed spatial data warehouses supported by payment schedules based on a variety of public and private needs. The OGETA consortium in the Atlanta Greater Metropolitan Area is one example. (See http://www.ogeta.com.) Some of the key players in OGETA are utilities, who share many of the same problems as municipalities and who can benefit from sharing data with municipalities.

 

o        Implications for Government Policy

In enterprises both private and public, advancing information technologies wreak change. Organizations are an artifact of information flows and missions. Information technology alters information flows at the same time that it alters the environment in which organizations’ missions were formed. Interoperable geoprocessing is no exception. All of the policy issues – hazards and opportunities – raised by the World Wide Web are issues for people concerned about interoperable geoprocessing, and there are others peculiar to geoprocessing.

OGC is unusual among IT consortia in its commitment to government involvement in specifying new technology. In its Special Interest Groups (SIGs), OGC provides a unique opportunity for domains in society to direct the unfolding of a key technology that will significantly affect lives and businesses. OGC's Technical Committee has a Domain Task Force composed of domain-focused SIGs and working groups, and there are also SIGs that operate in OGC's Management Committee. In the Telco SIG, for example, telecommunications industry experts shape the special geoprocessing interfaces necessary for their industry. If the public sector is properly represented in OGC, the educational, environmental, and public service applications of distributed geoprocessing have a better chance of unfolding in a way that benefits the public. Information architecture design requires business process expertise as well as technical expertise, and the OpenGIS Specification is being developed by technical experts in the OGC Technical Committee with direction from business and government decision makers in the OGC Management Committee.

USDA NRCS, USGS NMD, FGDC, and other Federal agencies are involved in OGC, providing vendors with an opportunity to stay in tune with those agencies' needs, and providing those agencies with intelligence for both guiding the integration of geospatial interoperability into their architectures and influencing the course of commercial geoprocessing technology development. FGDC is involved with OGC on a number of projects. NIMA is using OGC to support "spiral development" and COTS component based procurements, and this model could be adapted by other federal agencies. NIMA and the US Army Corps of Engineers Topographic Engineering Center (TEC) are sponsoring a Web Mapping Interoperability Initiative in OGC’s new Interoperability Program. Such government sponsored initiatives can have a significant effect on the rollout of commercial technology products, and the technical awareness and market awareness gained by the agencies helps them make better technology policy decisions.

Some of the policy issues facing urban and regional agencies include:

What are the benefits of cooperating with the FGDC in bringing local data into conformance with federal Framework standards?

As the technology comes into place to make Web-based data distribution practical, should local agencies sell or give away their data over the Web, or keep it private?

As the benefits of data coordination become better documented, what initiatives will cities undertake to bring agencies and perhaps utilities and public works contractors into alignment?

What can or should cities do to protect the privacy of citizens with respect to city-developed spatial data about citizens and their property?

What are the proper roles of private sector contractors in developing, maintaining, and distributing city geodata?

 

o        Implications for Regional and Urban Commerce

To a great extent, businesses pursuing expanded markets drive OGC's progress. The vendors cooperate with each other because they know that non-interoperability is the bottleneck that has retarded their market.

Various technology providers will find significant business in urban and regional settings. Software vendors are building database engines, visualization tools, image interpreters, data fusion systems, Web mapping interfaces, geospatial catalogs, and workflow integration approaches. Telcos are building the "spatial dial tone" for online services such as spatially aware yellow pages. Data providers and database providers are tailoring the electronic commerce approaches that will "meter" geographic information to Internet customers.

Internet-available geographic information also has a much wider market significance: the major implications for commerce involve spatially aware, mobile consumers. Through the Web and appropriate network-resident spatial services, thousands of businesses will be able to inform nearby motorists about their products, services, and locations. Motorists will be able to easily learn what product and service sources are close by, and their navigation systems will make it easy to find these business sites. Listing fees (similar to fees to be listed in the telephone business directory) and advertising revenues are just two of the profit potentials of such a system. Many of the same Internet-resident services that enable commerce will enable non-commercial public benefits such as improved emergency response; highway safety; traffic avoidance; reductions in travel time, fuel use, and pollution; and other benefits of a motorist-serving Internet.

The same set of spatial services and location information that serves motorists would also serve people with pocket-sized, GPS-equipped personal communications devices. Cell phone manufacturers are developing 65-gram units with phone, GPS, and screen. When people discover that they can find, offer, or deliver products and services while they or their customers are on the move, they will want or need to do so, and the market will grow rapidly.

Much of the data will be different from the data researchers and practitioners have grown accustomed to using. Computers will present it to people differently. Computers will use it, unbeknownst to the non-technical user, to provide very specific information or to control simple things. People sending email don’t care about servers, switches, routers, and protocols. Similarly, when a car tells its driver that the nearest pizza place is 1.3 kilometers ahead on the left, or when a backhoe tells its operator that he is dangerously close to a gas main, the driver and operator won’t care about the data format or semantics (though someone will be making a living caring about them).

 

o        Implications for Urban and Regional Life

The introduction of spatial capabilities into the information infrastructure will have a number of effects on people's lives. Many of these effects will be largely unrelated to one another.

As we move to an information economy, more and more people will be employed in information-related jobs, and the introduction of spatial capabilities into the mainstream information infrastructure will add to this phenomenon. Some people will collect spatial data, others will check, compare, and add value to spatial data sets. Police dispatchers will watch the locations of police and police vehicles on their screens. Civil engineering firms and surveyors will have new duties related to data collection and management. Cities may save money by employing more spatial data experts, whose work with the spatial data infrastructure will optimize the management of the cities’ physical infrastructure. Some will have jobs that we can’t imagine now, because the ultimate effects of technology insertion are so hard to predict.

Good access to good spatial information will help cities manage public works projects, emergency response, disaster response, and delivery of services such as trash collection and social services.

Environmental management will benefit, and it will be easier for alert managers to find opportunities for one business’s waste to be used as another business’s resource.

Business development authorities, property sellers, realtors, and businesses seeking new facilities will find each other more easily, potentially supporting cities’ economic development goals.

Just as spatially enabled online multimedia yellow pages will help consumers find the nearest source of products and services, individuals may, through community development web pages, find neighbors with whom they share interests, or from whom they can receive certain products or services. Small, locally owned businesses and self-employed individuals could find such an application of network-resident, simple, spatial services to be a very important resource. Widespread use of the spatially enabled web for such purposes could have a significant positive effect on the livability of cities, and of rural regions as well. There is the potential to reduce automobile use and strengthen neighborhood social networks.

But the effects of technology insertion on social, political, and cultural systems are hard to predict, and technology is not always a blessing. For example, in countries with poorly developed cadastral systems, interoperable geoprocessing (and GPS and other technologies) will contribute to the ease of establishing new cadastres. Developing countries may see increased prosperity result from formalized cadastres and land registration, benefiting from the positive relationship between land ownership and farm productivity, and between land ownership and small business development. (Hernando de Soto, The Other Path. 1990. Harper Collins, New York, NY.) There is, however, the danger that governments will use the new cadastre to levy taxes more broadly, and small owners unable to pay the taxes will be forced to sell to the larger owners, defeating the stated land reform and economic development goals of the new cadastre.

  3. Conclusions

The OpenGIS Specification is of interest to readers of the Annals of Regional Science for two reasons: better access to geodata and geoprocessing resources can only benefit those who study regional economics, resource management, location theory, urban and regional planning, transportation and communication, human geography, population distribution, and environmental quality. At the same time, some of these phenomena will themselves be affected by better access to geodata and geoprocessing resources. This article will have succeeded if some researchers are motivated by it to study these technology insertion effects, and if practitioners are motivated to learn more about the ways in which they can take advantage of interoperable geoprocessing to increase the efficiency of their work and reduce the cost of getting the operational information they need.