I have some QGIS layers and I am performing some spatial queries with them using geopandas. I'm using psycopg2 to make a connection to a local postgres/postgis database. Then I use the read_postgis() function from geopandas to make a spatial query with my layers. Here is my code to get started.
```python
import psycopg2
import geopandas as gpd

postgres_connection = psycopg2.connect(
    host="localhost",
    port=5432,
    database="BRE_2019_Test",
    user="my_username",
    password="my_password",
)
```
Now I do a few spatial queries, and this all works correctly.
```python
# this is a bunch of tax parcels in my study area
all_2019_parcels = gpd.read_postgis(
    'SELECT * FROM public."All_Parcels_2019"',
    postgres_connection,
)

# these are parcels from the All_Parcels_2019 table which are
# spatially within the shape of Zone 1a
zone_1a_parcels = gpd.read_postgis(
    'SELECT ap.* '
    'FROM public."Zone1a" AS z1a, public."All_Parcels_2019" AS ap '
    'WHERE st_intersects(z1a.geom, ap.geom)',
    postgres_connection,
)
```
zone_1a_parcels returns 1671 records, which is correct. Just to give you a visual, here is a screenshot of the same operation in QGIS using 'Select by Location'.
Now that I have all_2019_parcels as a geopandas dataframe, I want to make a new column called 'Zone' and update the values for these selected parcels as 'Zone1a'.
Here is my best attempt so far and it works, sort of.
```python
all_2019_parcels['Zone'] = all_2019_parcels['parno'].isin(zone_1a_parcels['parno'])
all_2019_parcels.loc[all_2019_parcels['Zone'] == True, 'Zone'] = 'Zone1a'
```

(Note the isin() membership test has to run on all_2019_parcels['parno'], not the other way around; otherwise the assignment aligns on the row index rather than on parcel numbers.)
This labels the correct number of parcels as 'Zone1a'. However, I have other zones (1b, 1c, and so on). When I run the same two lines to select the parcels within Zone 1b, for example, the 'Zone1a' values in that column get overwritten.
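To illustrate the overwriting, here is a toy pandas-only example (the parcel numbers are made up, and the zone lists stand in for my query results). Updating only the matching rows with .loc, instead of reassigning the whole column, seems to avoid the problem:

```python
import pandas as pd

# Toy stand-in for all_2019_parcels; parcel numbers are invented
parcels = pd.DataFrame({"parno": ["A1", "A2", "B1", "B2"]})
zone_1a_parnos = ["A1", "A2"]  # pretend these came from the Zone1a query
zone_1b_parnos = ["B1"]        # pretend these came from the Zone1b query

# What I am doing now: each assignment replaces the whole 'Zone' column,
# so labelling the second zone wipes out the first zone's labels.
parcels["Zone"] = parcels["parno"].isin(zone_1a_parnos)
parcels["Zone"] = parcels["parno"].isin(zone_1b_parnos)  # Zone1a info is gone

# Updating only the matching rows with .loc leaves the other rows alone.
parcels["Zone"] = None
parcels.loc[parcels["parno"].isin(zone_1a_parnos), "Zone"] = "Zone1a"
parcels.loc[parcels["parno"].isin(zone_1b_parnos), "Zone"] = "Zone1b"
print(parcels["Zone"].tolist())  # ['Zone1a', 'Zone1a', 'Zone1b', None]
```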
I feel like I am going about this incorrectly. There must be a way to run a SQL query and label values in the 'Zone' column of all_2019_parcels without creating an intermediate object like zone_1a_parcels for every zone.
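Something along these lines is what I have in mind: building one CASE expression over all the zone tables so PostGIS assigns the label in a single query. This is an untested sketch; the "Zone1b" and "Zone1c" table names are guesses based on my naming scheme.

```python
# Untested sketch: generate one CASE expression covering every zone table,
# so each parcel gets its zone label directly from PostGIS.
# "Zone1b" and "Zone1c" are assumed table names following my schema.
zone_tables = ["Zone1a", "Zone1b", "Zone1c"]

when_clauses = "\n        ".join(
    f'WHEN EXISTS (SELECT 1 FROM public."{z}" AS z '
    f"WHERE st_intersects(z.geom, ap.geom)) THEN '{z}'"
    for z in zone_tables
)
sql = (
    "SELECT ap.*,\n"
    "    CASE\n"
    f"        {when_clauses}\n"
    '    END AS "Zone"\n'
    'FROM public."All_Parcels_2019" AS ap'
)
print(sql)

# A single call would then replace all of the per-zone dataframes:
# all_2019_parcels = gpd.read_postgis(sql, postgres_connection, geom_col="geom")
```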