
City2Graph is a Python library for converting a wide range of geospatial datasets into graph structures, a capability central to applications involving Graph Neural Networks (GNNs) and other forms of spatial analysis. The library offers a unified interface for handling data across multiple domains, such as street networks, transportation systems, origin-destination matrices, and proximity between points of interest. It aims to help researchers and practitioners build sophisticated GeoAI and geographic data science applications.
The library supports graph construction from diverse data sources. These include morphological datasets such as buildings, streets, and land use from OpenStreetMap and Overture Maps, as well as transportation datasets such as public transport information from GTFS. It also supports contiguity datasets (e.g., land use, administrative boundaries) and mobility datasets (e.g., bike-sharing, migration, pedestrian flows). A key capability is converting geospatial data from GeoPandas and NetworkX formats into tensors compatible with PyTorch Geometric's Data and HeteroData structures, facilitating graph representation learning.
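To illustrate the kind of conversion described above, the sketch below builds a proximity graph from point-of-interest coordinates and emits it in the two-row edge-index layout that PyTorch Geometric's Data structure uses. This is a minimal, dependency-free illustration of the data flow, not City2Graph's actual API; the sample coordinates and the `proximity_graph` helper are hypothetical.

```python
import math

# Hypothetical point-of-interest coordinates (id -> (x, y)).
# In practice these would come from a GeoDataFrame's geometry column.
pois = {0: (0.0, 0.0), 1: (1.0, 0.0), 2: (0.9, 0.1), 3: (5.0, 5.0)}

def proximity_graph(points, threshold):
    """Connect every ordered pair of points closer than `threshold`,
    returning a PyTorch-Geometric-style edge index as two parallel
    lists (source ids, destination ids)."""
    src, dst = [], []
    ids = sorted(points)
    for i in ids:
        for j in ids:
            if i == j:
                continue
            if math.dist(points[i], points[j]) <= threshold:
                src.append(i)
                dst.append(j)
    return src, dst

edge_index = proximity_graph(pois, threshold=1.5)
# Nodes 0, 1, and 2 form a connected cluster; node 3 stays isolated.
```

In a real workflow the resulting edge index would be wrapped in a `torch.tensor` and passed, together with node features, to `torch_geometric.data.Data`.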
A distinguishing feature of City2Graph is its capacity to model complex urban systems by representing multiple geospatial relations as heterogeneous graphs. This bridges the gap between traditional GIS methodologies and modern GNNs across a broad spectrum of applications. By building on standard libraries such as PyTorch Geometric, it integrates smoothly into deep learning workflows. Its versatile graph construction interface also suits network analysis of urban systems, including multi-modal accessibility studies that combine street and public transport networks.
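The heterogeneous-graph idea can be sketched with PyTorch Geometric's (source type, relation, destination type) triple convention for edge types, which is what HeteroData uses internally. The node types, relation names, and toy edge lists below are hypothetical placeholders, not City2Graph output:

```python
# Multiple geospatial relations stored per edge type, keyed by
# (source_type, relation, destination_type) triples as in HeteroData.
hetero_edges = {
    # street junctions connected by street segments
    ("junction", "street_segment", "junction"): ([0, 1], [1, 2]),
    # buildings attached to their nearest junction
    ("building", "nearest_to", "junction"): ([0, 1], [0, 2]),
    # transit stops linked along a public-transport route
    ("stop", "serves_route", "stop"): ([0], [1]),
}

# Node counts per type, used to validate each edge type's endpoints.
node_counts = {"junction": 3, "building": 2, "stop": 2}

def validate(edges, counts):
    """Check that every edge references valid node ids for its
    endpoint types, mirroring per-type edge_index bookkeeping."""
    for (src_t, _rel, dst_t), (src, dst) in edges.items():
        assert all(i < counts[src_t] for i in src)
        assert all(j < counts[dst_t] for j in dst)
    return True

validate(hetero_edges, node_counts)
```

Keeping a separate edge index per relation type is what lets a single graph mix, for example, street topology and transit connectivity in one multi-modal accessibility analysis.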