Anne Arundel Imports
About
Anne Arundel County publishes a large amount of open data. This data could be used to dramatically improve the accuracy and completeness of the current map data in the county. Chiefly, this includes building polygons, address points, and roads.
Goals
The goal of this import is to improve the map and make on-the-ground surveying easier by adding building polygons and addresses. Right now, adding POIs is difficult because I cannot accurately match POIs from my ground surveys to the map.
This import will introduce buildings and address points. I've also found a large number of inaccurate TIGER roads, with both bad geometries and incorrect road names. The county-maintained street network will be used to correct road names and as a reference layer for untangling some of the TIGER knots.
Schedule
No fixed schedule; validating data takes varying amounts of time depending on urban/rural density.
Building data will be imported first, followed by a second pass that imports address data to add missing buildings and to verify that address street names match road names.
Import Data
Background
All data being imported comes from the Anne Arundel County open data portal.
Data source site: http://www.aacounty.org/county-maps/
Data license: http://www.aacounty.org/county-maps/data-download/GIS_Disclaimer.pdf
Type of license: Public Domain, and Verbal Confirmation
ODbL Compliance verified: Yes
Import Type
This import will be performed by me and will cover the entire building and address datasets. The import will be done in JOSM, with manual verification against Bing satellite imagery.
The building dataset is not updated frequently by the county, but I hope to write a script after the import to help identify missing addresses in subsequent releases of the address dataset.
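Such a script could be as simple as diffing two address exports. A minimal sketch, assuming each export has been reduced to (housenumber, street) pairs (the function name and input shape are mine, not an existing tool):

```python
def new_addresses(previous, current):
    """Return addresses present in the current county export but absent
    from the previous one - candidates for a follow-up import.
    Comparison is case-insensitive and ignores surrounding whitespace."""
    prev = {(hn.strip(), st.strip().upper()) for hn, st in previous}
    return sorted(
        (hn, st) for hn, st in current
        if (hn.strip(), st.strip().upper()) not in prev
    )

# Hypothetical example rows; real data would come from the county portal.
old = [("123", "Main St"), ("125", "Main St")]
new = [("123", "Main St"), ("125", "Main St"), ("127", "Main St")]
print(new_addresses(old, new))  # [('127', 'Main St')]
```

Anything the diff reports would still need manual review in JOSM before upload, since an address may already exist in OSM from another source.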
Data Preparation
Tagging Plans
building = yes
addr:housenumber = <NUMBER>
addr:street = <STREET>
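The tagging plan above can be expressed as a small mapping function. This is only a sketch; the source field names (HOUSE_NUM, STREET) are hypothetical stand-ins for whatever the county shapefiles actually use:

```python
def to_osm_tags(record):
    """Map one county address record (hypothetical field names)
    to the OSM tags listed in the tagging plan."""
    tags = {"building": "yes"}
    if record.get("HOUSE_NUM"):
        tags["addr:housenumber"] = str(record["HOUSE_NUM"]).strip()
    if record.get("STREET"):
        # County data is uppercase; title-case it for OSM.
        tags["addr:street"] = record["STREET"].strip().title()
    return tags

print(to_osm_tags({"HOUSE_NUM": " 44 ", "STREET": "RIVA ROAD"}))
# {'building': 'yes', 'addr:housenumber': '44', 'addr:street': 'Riva Road'}
```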
Data Merge Workflow
Workflow
The building and address datasets were loaded into QGIS and all extraneous columns were dropped. The shapefiles were then loaded into JOSM using the OpenData importer.
Buildings are then imported by splitting the data into small geographic extents (neighbourhood, peninsula, etc.).
As many visual validation problems as possible are fixed (buildings crossing roads, duplicate buildings, etc.). Once the data looks clean, it is imported.
After the building data is imported, I open a similar extent containing the address data. Since the street names are abbreviated, I manually expand them and verify that each address matches the road it pertains to. This allows adding many roads that the imported TIGER data was missing, fixing roads that exist but have bad geometries, and renaming roads that are incorrectly named in the TIGER dataset.
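The expansion itself is done by hand in JOSM, but the pattern is mechanical enough to sketch. A minimal example, assuming only a trailing suffix abbreviation needs expanding (the suffix table here is a small illustrative subset, not the full USPS list):

```python
# Illustrative subset of common suffix abbreviations in the county data.
SUFFIXES = {"ST": "Street", "RD": "Road", "AVE": "Avenue", "DR": "Drive",
            "CT": "Court", "LN": "Lane", "BLVD": "Boulevard", "CIR": "Circle"}

def expand_street(name):
    """Title-case a street name and expand its trailing suffix abbreviation."""
    parts = name.strip().title().split()
    last = parts[-1].upper().rstrip(".")
    if last in SUFFIXES:
        parts[-1] = SUFFIXES[last]
    return " ".join(parts)

print(expand_street("RIVA RD"))    # Riva Road
print(expand_street("Broadway"))   # Broadway (no suffix to expand)
```

Edge cases like "St" as a leading "Saint" abbreviation are exactly why the real workflow keeps a human in the loop.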
Once the street names match, the addresses are manually merged into individual buildings where possible. Where a building polygon has more than one address, the addresses are left as nodes. Where the nodes are messy in the source data, I align them into a line along the front edge of the building.
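The merge rule above (attach an address to a building only when it is unambiguous) can be sketched with a plain point-in-polygon test. This is an illustration of the rule, not the actual JOSM workflow, and the data structures are my own:

```python
def point_in_polygon(pt, polygon):
    """Ray-casting test: is point (x, y) inside polygon [(x, y), ...]?"""
    x, y = pt
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Edge straddles the horizontal ray; check the crossing's x position.
        if (y1 > y) != (y2 > y) and x < (x2 - x1) * (y - y1) / (y2 - y1) + x1:
            inside = not inside
    return inside

def merge_addresses(buildings, addresses):
    """Merge an address onto a building only when exactly one address node
    falls inside its polygon; ambiguous addresses stay as separate nodes."""
    merged, unmerged = {}, list(addresses)
    for bid, poly in buildings:
        hits = [a for a in addresses if point_in_polygon(a[0], poly)]
        if len(hits) == 1:
            merged[bid] = hits[0][1]
            unmerged.remove(hits[0])
    return merged, unmerged

square = [(0, 0), (4, 0), (4, 4), (0, 4)]
print(merge_addresses([("b1", square)], [((2, 2), {"addr:housenumber": "44"})]))
# ({'b1': {'addr:housenumber': '44'}}, [])
```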