Initial commit - Stage 1 working version
Saving current working state before proceeding to Stage 2.

Includes:
- Backend: Python-based QC validator with shapefile processing
- Frontend: Drag-and-drop file upload interface
- Sample files for testing
- Documentation and revision history

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
This commit is contained in:
commit 12407b74e4
23 .gitignore vendored Normal file
@@ -0,0 +1,23 @@
# Python
__pycache__/
*.py[cod]
*$py.class
*.so
.Python
env/
venv/
ENV/

# Logs
*.log

# Temporary files
temp/
*.tmp

# Archives (keeping original samplefiles directory)
*.zip

# OS files
.DS_Store
Thumbs.db
191 README.md Normal file
@@ -0,0 +1,191 @@
# Verofy HLD Auto Upload & QC Tool - Phase 1

A web-based tool for uploading and validating Verofy HLD shapefiles with comprehensive Quality Control checks.

## Features

- Drag & drop ZIP file upload interface
- Comprehensive QC validation for 10 required shapefiles
- WGS 84 projection verification
- Attribute field validation with detailed error reporting
- VerofyMapID collection on successful QC

## Architecture

- **Frontend**: JavaScript web application with drag & drop interface
- **Backend**: FastAPI Python server with shapefile validation
- **Port**: 8000 (backend); open `index.html` directly for the frontend

## Installation

### Prerequisites
- Python 3.10+
- uv package manager (or pip)

### Backend Setup

```bash
cd backend

# Install uv (if not already installed)
curl -LsSf https://astral.sh/uv/install.sh | sh

# Create virtual environment and install dependencies
~/.local/bin/uv venv
~/.local/bin/uv pip install -r requirements.txt
```

## Running the Application

### Start Backend Server

```bash
cd backend
source .venv/bin/activate
python main.py
```

The server will start on `http://localhost:8000`.

### Open Frontend

Open `frontend/index.html` in your web browser. The frontend will automatically connect to the backend at `http://localhost:8000`.

## Usage

1. Open `frontend/index.html` in your browser
2. Drag and drop a ZIP file containing shapefiles (or click to browse)
3. The system will:
   - Extract the ZIP file
   - Run comprehensive QC validation
   - If QC **fails**: download a detailed error report (QC_report.txt)
   - If QC **passes**: show a success message and prompt for VerofyMapID

## QC Validation Rules

The tool validates the following for each shapefile upload:

### Required Shapefiles (all 10 must be present)
- poles
- network_elements
- splicing
- sites
- parcels
- permits
- cabinet_boundaries
- segments
- access_points
- cables

### General Validation
1. **Projection**: All shapefiles must be in the WGS 84 projection
2. **UID Field**: All shapefiles must have a UID field with unique integer values
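A minimal sketch of these two general checks (illustrative only; the authoritative implementation, including per-check error capping, lives in `backend/qc_validator.py`):

```python
def check_uids(uids):
    """True when every UID is an integer and no value repeats."""
    return all(isinstance(u, int) for u in uids) and len(uids) == len(set(uids))


def looks_like_wgs84(prj_text):
    """Heuristic WGS 84 check mirroring the documented rule: the .prj
    well-known text must mention the WGS 1984 datum."""
    return 'WGS_1984' in prj_text or 'WGS84' in prj_text


print(check_uids([1, 2, 3]))   # True
print(check_uids([1, 2, 2]))   # False (duplicate UID)
```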

### Shapefile-Specific Attribute Validation

Note: the zone group field is literally named `Group 1` (with a space).

#### Segments
- Type: Must be one of [Aerial, 3rd Party Duct, Underground, Existing VERO, Drop Cable]
- Group 1: Must be "Zone XX" format (XX = 2 digits)
- Conduit: Required for the Underground type; format "(3|1)-1.25\" SDR 13.5 HDPE"

#### Access Points
- Type: Must be [Handhole, Cabinet]
- Group 1: Must be "Zone XX" format
- Latitude: Must be a number
- Longitude: Must be a number

#### Cabinet Boundaries
- Name: Must be "Zone XX Boundary" format

#### Permits
- Name: Must start with "ROW", "ROE", or "LLP"

#### Cables
- Name: Must begin with "XXXF." format (XXX = 3 digits)

#### Parcels
- Name: Must be exactly "Parcel"
- Group 1: Must be "Zone XX" format

#### Sites
- Type: Must be one of [School, Hub Site, MDU, Administration, MTU, Dwelling Unit, Vendor Location, Cell Tower, Government, Data Center, Hosptial, Internet, Large Business, Library, Museum, Power Substation, Small Business, Small Cell, Stadium, University, Splice Point, ILA, SFR, Vacant Lot, Mobile Home, Meet Me]
- Address: Must be populated
- State: Must be 2 letters
- Zip: Must be 5 digits
- BEN#: Must be an integer
- Latitude: Must be a number
- Longitude: Must be a number

#### Splicing
- AKA: Must begin with "YYY_Y" format (Y = letters)
- Type: Must be [MST, Splice, FTP]
- Group 1: Must be "Zone XX" format
- Latitude: Must be a number
- Longitude: Must be a number

#### Network Elements
- Type: Must be [Slack Coil, Anchor, Bore Pit, Riser]
- Group 1: Must be "Zone XX" format
- Latitude: Must be a number
- Longitude: Must be a number

#### Poles
- Pole_Tag: Must be populated
- Pole_Owner: Must be populated
- Group 1: Must be "Zone XX" format
- Latitude: Must be a number
- Longitude: Must be a number
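The naming formats above boil down to a handful of regular expressions. A sketch (the sample values are hypothetical; the authoritative patterns live in `backend/qc_validator.py`):

```python
import re

# Regexes mirroring the documented naming formats (illustrative only)
ZONE_RE = re.compile(r'^Zone \d{2}$')            # zone group: "Zone 07"
BOUNDARY_RE = re.compile(r'^Zone \d{2} Boundary$')  # cabinet_boundaries Name
CABLE_RE = re.compile(r'^\d{3}F')                # cables Name: "288F.Main"
AKA_RE = re.compile(r'^[A-Z]{3}_[A-Z]')          # splicing AKA: "ABC_D01"

print(bool(ZONE_RE.match("Zone 07")))   # True
print(bool(ZONE_RE.match("Zone 7")))    # False: exactly two digits required
print(bool(CABLE_RE.match("288F.Main")))  # True
```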

## Error Reporting

When QC fails:
- A `QC_report.txt` file is automatically downloaded
- The report lists all validation errors
- If more than 10 features fail the same validation, the report shows a single "10 or more features failed XXX" line
- If a shapefile fails multiple QC checks, each is listed separately
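The capping behavior can be sketched as follows (a simplified illustration; the function name `report_failures` is hypothetical and not part of the tool):

```python
def report_failures(shapefile_name, check_name, failed_indices):
    """Emit at most 10 per-feature messages, then one summary line when
    more than 10 features fail the same check."""
    errors = [f"{shapefile_name}: Feature {idx} failed {check_name}"
              for idx in failed_indices[:10]]
    if len(failed_indices) > 10:
        errors.append(f"{shapefile_name}: 10 or more features failed {check_name}")
    return errors


msgs = report_failures("cables", "Name validation", list(range(25)))
print(len(msgs))  # 11: ten detail lines plus one summary line
```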

## Project Structure

```
dragnddrop/
├── backend/
│   ├── .venv/            # Virtual environment
│   ├── main.py           # FastAPI server
│   ├── qc_validator.py   # QC validation logic
│   ├── requirements.txt  # Python dependencies
│   └── server.log        # Server logs
├── frontend/
│   ├── index.html        # Main UI
│   ├── upload.js         # Upload logic
│   └── style.css         # Styling
├── temp/                 # Temporary upload directory
├── samplefiles/          # Sample shapefiles for reference
└── README.md             # This file
```

## Phase 1 Notes

- Phase 1 focuses on QC validation only
- VerofyMapID is collected but not yet sent to any API (planned for Phase 2)
- The `verofy_api/` and `oldqc/` folders are for Phase 2 and not used in Phase 1

## Development

### Dependencies
- **fastapi**: Web framework
- **uvicorn**: ASGI server
- **python-multipart**: File upload support
- **pyshp**: Shapefile reading library

### Testing

Test with the sample upload file:
```bash
curl -X POST -F "file=@sampleupload.zip" http://localhost:8000/upload
```

## Troubleshooting

- **CORS errors**: Make sure the backend is running on port 8000
- **File not found errors**: Ensure the ZIP contains shapefiles in the root or a single subdirectory
- **Validation errors**: Check QC_report.txt for detailed error messages
70 backend/main.py Normal file
@@ -0,0 +1,70 @@
from fastapi import FastAPI, File, UploadFile
from fastapi.responses import FileResponse, PlainTextResponse
from fastapi.middleware.cors import CORSMiddleware
import zipfile
import os
import shutil
from pathlib import Path
from qc_validator import validate_shapefiles

app = FastAPI()

# Enable CORS for frontend
app.add_middleware(
    CORSMiddleware,
    allow_origins=["*"],
    allow_credentials=True,
    allow_methods=["*"],
    allow_headers=["*"],
)

TEMP_DIR = Path("../temp")
TEMP_DIR.mkdir(exist_ok=True)


@app.post("/upload")
async def upload_shapefile(file: UploadFile = File(...)):
    """Handle shapefile ZIP upload and QC validation"""

    # Clear temp directory
    for item in TEMP_DIR.glob("*"):
        if item.is_file():
            item.unlink()
        elif item.is_dir():
            shutil.rmtree(item)

    # Save uploaded file
    zip_path = TEMP_DIR / file.filename
    with open(zip_path, "wb") as f:
        content = await file.read()
        f.write(content)

    # Unzip file
    try:
        with zipfile.ZipFile(zip_path, 'r') as zip_ref:
            zip_ref.extractall(TEMP_DIR)
    except Exception as e:
        return PlainTextResponse(f"Error extracting ZIP file: {str(e)}", status_code=400)

    # Run QC validation
    qc_result = validate_shapefiles(TEMP_DIR)

    if qc_result["passed"]:
        return {"message": "success"}
    else:
        # Generate QC report
        report_path = TEMP_DIR / "QC_report.txt"
        with open(report_path, "w") as f:
            f.write("QC VALIDATION FAILED\n")
            f.write("=" * 50 + "\n\n")
            for error in qc_result["errors"]:
                f.write(f"{error}\n")

        return FileResponse(
            path=report_path,
            media_type="text/plain",
            filename="QC_report.txt"
        )


if __name__ == "__main__":
    import uvicorn
    uvicorn.run(app, host="0.0.0.0", port=8000)
893 backend/qc_validator.py Normal file
@@ -0,0 +1,893 @@
import shapefile
from pathlib import Path
import re

# Required shapefiles
REQUIRED_SHAPEFILES = [
    "poles",
    "network_elements",
    "splicing",
    "sites",
    "parcels",
    "permits",
    "cabinet_boundaries",
    "segments",
    "access_points",
    "cables"
]

# WGS 84 projection string (EPSG:4326)
WGS84_PROJ = 'GEOGCS["GCS_WGS_1984",DATUM["D_WGS_1984",SPHEROID["WGS_1984",6378137.0,298.257223563]],PRIMEM["Greenwich",0.0],UNIT["Degree",0.0174532925199433]]'


def validate_shapefiles(temp_dir: Path):
    """Main QC validation function"""
    errors = []

    # Find the directory containing shapefiles (might be in a subdirectory)
    shapefile_dir = find_shapefile_directory(temp_dir)
    if not shapefile_dir:
        errors.append("No shapefiles found in the uploaded ZIP")
        return {"passed": False, "errors": errors}

    # Check all required shapefiles exist
    missing = check_required_shapefiles(shapefile_dir)
    if missing:
        for shapefile_name in missing:
            errors.append(f"Missing required shapefile: {shapefile_name}")
        return {"passed": False, "errors": errors}

    # Validate each shapefile
    for shapefile_name in REQUIRED_SHAPEFILES:
        shp_path = shapefile_dir / f"{shapefile_name}.shp"

        # Validate projection
        proj_errors = validate_projection(shp_path, shapefile_name)
        errors.extend(proj_errors)

        # Validate UID field
        uid_errors = validate_uid_field(shp_path, shapefile_name)
        errors.extend(uid_errors)

        # Validate attributes based on shapefile type
        attr_errors = validate_attributes(shp_path, shapefile_name)
        errors.extend(attr_errors)

    # Perform spatial validation (features within correct cabinet boundaries)
    spatial_errors = validate_spatial_containment(shapefile_dir)
    errors.extend(spatial_errors)

    return {"passed": len(errors) == 0, "errors": errors}


def find_shapefile_directory(temp_dir: Path):
    """Find the directory containing the shapefiles (may be in a subdirectory)"""
    # Check root directory first
    shp_files = list(temp_dir.glob("*.shp"))
    if shp_files:
        return temp_dir

    # Check subdirectories
    for subdir in temp_dir.iterdir():
        if subdir.is_dir():
            shp_files = list(subdir.glob("*.shp"))
            if shp_files:
                return subdir

    return None


def check_required_shapefiles(shapefile_dir: Path):
    """Check if all required shapefiles exist"""
    missing = []
    for shapefile_name in REQUIRED_SHAPEFILES:
        shp_path = shapefile_dir / f"{shapefile_name}.shp"
        if not shp_path.exists():
            missing.append(shapefile_name)
    return missing


def validate_projection(shp_path: Path, shapefile_name: str):
    """Validate shapefile is in WGS 84 projection"""
    errors = []
    prj_path = shp_path.with_suffix('.prj')

    if not prj_path.exists():
        errors.append(f"{shapefile_name}: Missing .prj file")
        return errors

    with open(prj_path, 'r') as f:
        proj_content = f.read().strip()

    # Check if it contains WGS 84 identifiers
    if 'WGS_1984' not in proj_content and 'WGS84' not in proj_content:
        errors.append(f"{shapefile_name}: Not in WGS 84 projection")

    return errors


def validate_uid_field(shp_path: Path, shapefile_name: str):
    """Validate UID field exists and contains unique integers"""
    errors = []

    try:
        sf = shapefile.Reader(str(shp_path))
    except Exception as e:
        errors.append(f"{shapefile_name}: Error reading shapefile - {str(e)}")
        return errors

    # Check if UID field exists
    field_names = [field[0] for field in sf.fields[1:]]
    if 'UID' not in field_names:
        errors.append(f"{shapefile_name}: Missing UID field")
        return errors

    # Get UID field index
    uid_index = field_names.index('UID')

    # Collect UIDs and validate
    uids = []
    non_integer_count = 0

    for idx, record in enumerate(sf.records()):
        uid = record[uid_index]

        # Check if integer
        if not isinstance(uid, int):
            try:
                uid = int(uid)
            except (ValueError, TypeError):
                non_integer_count += 1
                if non_integer_count <= 10:
                    errors.append(f"{shapefile_name}: UID at feature index {idx} is not an integer")
                continue

        uids.append(uid)

    if non_integer_count > 10:
        errors.append(f"{shapefile_name}: 10 or more features failed UID is not an integer")

    # Check for uniqueness
    if len(uids) != len(set(uids)):
        duplicate_count = len(uids) - len(set(uids))
        if duplicate_count >= 10:
            errors.append(f"{shapefile_name}: 10 or more features failed UID is not unique")
        else:
            errors.append(f"{shapefile_name}: UID field contains {duplicate_count} duplicate values")

    sf.close()
    return errors


def validate_attributes(shp_path: Path, shapefile_name: str):
    """Validate shapefile-specific attributes"""

    validators = {
        "segments": validate_segments,
        "access_points": validate_access_points,
        "cabinet_boundaries": validate_cabinet_boundaries,
        "permits": validate_permits,
        "cables": validate_cables,
        "parcels": validate_parcels,
        "sites": validate_sites,
        "splicing": validate_splicing,
        "network_elements": validate_network_elements,
        "poles": validate_poles
    }

    validator = validators.get(shapefile_name)
    if validator:
        return validator(shp_path, shapefile_name)

    return []


def validate_segments(shp_path: Path, shapefile_name: str):
    """Validate segments shapefile attributes"""
    errors = []

    try:
        sf = shapefile.Reader(str(shp_path))
        field_names = [field[0] for field in sf.fields[1:]]

        # Check required fields ('Group 1' with a space, not 'Group_01')
        required_fields = ['Type', 'Group 1', 'Conduit']
        for field in required_fields:
            if field not in field_names:
                errors.append(f"{shapefile_name}: Missing required field '{field}'")
                return errors

        type_idx = field_names.index('Type')
        group_idx = field_names.index('Group 1')
        conduit_idx = field_names.index('Conduit')

        valid_types = ['Aerial', '3rd Party Duct', 'Underground', 'Existing VERO', 'Drop Cable']
        failure_counts = {'type': 0, 'group': 0, 'conduit': 0}

        for idx, record in enumerate(sf.records()):
            # Validate Type
            if record[type_idx] not in valid_types:
                failure_counts['type'] += 1
                if failure_counts['type'] <= 10:
                    errors.append(f"{shapefile_name}: Feature {idx} has invalid Type value")

            # Validate Group 1 format (Zone XX)
            group_val = str(record[group_idx]) if record[group_idx] else ""
            if not re.match(r'^Zone \d{2}$', group_val):
                failure_counts['group'] += 1
                if failure_counts['group'] <= 10:
                    errors.append(f"{shapefile_name}: Feature {idx} has invalid Group 1 format (should be 'Zone XX')")

            # Validate Conduit (only for Underground)
            if record[type_idx] == 'Underground':
                conduit_val = str(record[conduit_idx]).strip() if record[conduit_idx] else ""
                # Check if first 8 characters match "(1)-1.25" or "(3)-1.25"
                # Using regex to handle any quote-like character
                if not re.match(r'^\([13]\)-1\.25.', conduit_val):
                    failure_counts['conduit'] += 1
                    if failure_counts['conduit'] <= 10:
                        errors.append(f"{shapefile_name}: Feature {idx} has invalid Conduit value for Underground type (must start with '(1)-1.25\"' or '(3)-1.25\"')")

        for key, count in failure_counts.items():
            if count > 10:
                errors.append(f"{shapefile_name}: 10 or more features failed {key} validation")

        sf.close()
    except Exception as e:
        errors.append(f"{shapefile_name}: Error validating attributes - {str(e)}")

    return errors


def validate_access_points(shp_path: Path, shapefile_name: str):
    """Validate access_points shapefile attributes"""
    errors = []

    try:
        sf = shapefile.Reader(str(shp_path))
        field_names = [field[0] for field in sf.fields[1:]]

        required_fields = ['Type', 'Group 1', 'Latitude', 'Longitude']
        for field in required_fields:
            if field not in field_names:
                errors.append(f"{shapefile_name}: Missing required field '{field}'")
                return errors

        type_idx = field_names.index('Type')
        group_idx = field_names.index('Group 1')
        lat_idx = field_names.index('Latitude')
        lon_idx = field_names.index('Longitude')

        valid_types = ['Handhole', 'Cabinet']
        failure_counts = {'type': 0, 'group': 0, 'lat': 0, 'lon': 0}

        for idx, record in enumerate(sf.records()):
            if record[type_idx] not in valid_types:
                failure_counts['type'] += 1
                if failure_counts['type'] <= 10:
                    errors.append(f"{shapefile_name}: Feature {idx} has invalid Type")

            group_val = str(record[group_idx]) if record[group_idx] else ""
            if not re.match(r'^Zone \d{2}$', group_val):
                failure_counts['group'] += 1
                if failure_counts['group'] <= 10:
                    errors.append(f"{shapefile_name}: Feature {idx} has invalid Group 1 format")

            try:
                float(record[lat_idx])
            except (ValueError, TypeError):
                failure_counts['lat'] += 1
                if failure_counts['lat'] <= 10:
                    errors.append(f"{shapefile_name}: Feature {idx} Latitude is not a number")

            try:
                float(record[lon_idx])
            except (ValueError, TypeError):
                failure_counts['lon'] += 1
                if failure_counts['lon'] <= 10:
                    errors.append(f"{shapefile_name}: Feature {idx} Longitude is not a number")

        for key, count in failure_counts.items():
            if count > 10:
                errors.append(f"{shapefile_name}: 10 or more features failed {key} validation")

        sf.close()
    except Exception as e:
        errors.append(f"{shapefile_name}: Error validating attributes - {str(e)}")

    return errors


def validate_cabinet_boundaries(shp_path: Path, shapefile_name: str):
    """Validate cabinet_boundaries shapefile attributes"""
    errors = []

    try:
        sf = shapefile.Reader(str(shp_path))
        field_names = [field[0] for field in sf.fields[1:]]

        if 'Name' not in field_names:
            errors.append(f"{shapefile_name}: Missing required field 'Name'")
            return errors

        name_idx = field_names.index('Name')
        failure_count = 0

        for idx, record in enumerate(sf.records()):
            name_val = str(record[name_idx]) if record[name_idx] else ""
            if not re.match(r'^Zone \d{2} Boundary$', name_val):
                failure_count += 1
                if failure_count <= 10:
                    errors.append(f"{shapefile_name}: Feature {idx} has invalid Name format (should be 'Zone XX Boundary')")

        if failure_count > 10:
            errors.append(f"{shapefile_name}: 10 or more features failed Name validation")

        sf.close()
    except Exception as e:
        errors.append(f"{shapefile_name}: Error validating attributes - {str(e)}")

    return errors


def validate_permits(shp_path: Path, shapefile_name: str):
    """Validate permits shapefile attributes"""
    errors = []

    try:
        sf = shapefile.Reader(str(shp_path))
        field_names = [field[0] for field in sf.fields[1:]]

        if 'Name' not in field_names:
            errors.append(f"{shapefile_name}: Missing required field 'Name'")
            return errors

        name_idx = field_names.index('Name')
        failure_count = 0

        for idx, record in enumerate(sf.records()):
            name_val = str(record[name_idx]) if record[name_idx] else ""
            if not (name_val.startswith('ROW') or name_val.startswith('ROE') or name_val.startswith('LLP')):
                failure_count += 1
                if failure_count <= 10:
                    errors.append(f"{shapefile_name}: Feature {idx} Name does not start with ROW, ROE, or LLP")

        if failure_count > 10:
            errors.append(f"{shapefile_name}: 10 or more features failed Name validation")

        sf.close()
    except Exception as e:
        errors.append(f"{shapefile_name}: Error validating attributes - {str(e)}")

    return errors


def validate_cables(shp_path: Path, shapefile_name: str):
    """Validate cables shapefile attributes"""
    errors = []

    try:
        sf = shapefile.Reader(str(shp_path))
        field_names = [field[0] for field in sf.fields[1:]]

        if 'Name' not in field_names:
            errors.append(f"{shapefile_name}: Missing required field 'Name'")
            return errors

        name_idx = field_names.index('Name')
        failure_count = 0

        for idx, record in enumerate(sf.records()):
            name_val = str(record[name_idx]) if record[name_idx] else ""
            if not re.match(r'^\d{3}F', name_val):
                failure_count += 1
                if failure_count <= 10:
                    errors.append(f"{shapefile_name}: Feature {idx} Name does not begin with XXXF format (three digits followed by capital F)")

        if failure_count > 10:
            errors.append(f"{shapefile_name}: 10 or more features failed Name validation")

        sf.close()
    except Exception as e:
        errors.append(f"{shapefile_name}: Error validating attributes - {str(e)}")

    return errors


def validate_parcels(shp_path: Path, shapefile_name: str):
    """Validate parcels shapefile attributes"""
    errors = []

    try:
        sf = shapefile.Reader(str(shp_path))
        field_names = [field[0] for field in sf.fields[1:]]

        required_fields = ['Name', 'Group 1']
        for field in required_fields:
            if field not in field_names:
                errors.append(f"{shapefile_name}: Missing required field '{field}'")
                return errors

        name_idx = field_names.index('Name')
        group_idx = field_names.index('Group 1')

        failure_counts = {'name': 0, 'group': 0}

        for idx, record in enumerate(sf.records()):
            if record[name_idx] != 'Parcel':
                failure_counts['name'] += 1
                if failure_counts['name'] <= 10:
                    errors.append(f"{shapefile_name}: Feature {idx} Name must be exactly 'Parcel'")

            group_val = str(record[group_idx]) if record[group_idx] else ""
            if not re.match(r'^Zone \d{2}$', group_val):
                failure_counts['group'] += 1
                if failure_counts['group'] <= 10:
                    errors.append(f"{shapefile_name}: Feature {idx} has invalid Group 1 format")

        for key, count in failure_counts.items():
            if count > 10:
                errors.append(f"{shapefile_name}: 10 or more features failed {key} validation")

        sf.close()
    except Exception as e:
        errors.append(f"{shapefile_name}: Error validating attributes - {str(e)}")

    return errors


def validate_sites(shp_path: Path, shapefile_name: str):
    """Validate sites shapefile attributes"""
    errors = []

    try:
        sf = shapefile.Reader(str(shp_path))
        field_names = [field[0] for field in sf.fields[1:]]

        required_fields = ['Type', 'Address', 'State', 'Zip', 'BEN#', 'Latitude', 'Longitude']
        for field in required_fields:
            if field not in field_names:
                errors.append(f"{shapefile_name}: Missing required field '{field}'")
                return errors

        type_idx = field_names.index('Type')
        address_idx = field_names.index('Address')
        state_idx = field_names.index('State')
        zip_idx = field_names.index('Zip')
        ben_idx = field_names.index('BEN#')
        lat_idx = field_names.index('Latitude')
        lon_idx = field_names.index('Longitude')

        valid_types = ['School', 'Hub Site', 'MDU', 'Administration', 'MTU', 'Dwelling Unit',
                       'Vendor Location', 'Cell Tower', 'Government', 'Data Center', 'Hosptial',
                       'Internet', 'Large Business', 'Library', 'Museum', 'Power Substation',
                       'Small Business', 'Small Cell', 'Stadium', 'University', 'Splice Point',
                       'ILA', 'SFR', 'Vacant Lot', 'Mobile Home', 'Meet Me']

        failure_counts = {'type': 0, 'address': 0, 'state': 0, 'zip': 0, 'ben': 0, 'lat': 0, 'lon': 0}

        for idx, record in enumerate(sf.records()):
            if record[type_idx] not in valid_types:
                failure_counts['type'] += 1
                if failure_counts['type'] <= 10:
                    errors.append(f"{shapefile_name}: Feature {idx} has invalid Type")

            if not record[address_idx] or str(record[address_idx]).strip() == '':
                failure_counts['address'] += 1
                if failure_counts['address'] <= 10:
                    errors.append(f"{shapefile_name}: Feature {idx} Address must be populated")

            state_val = str(record[state_idx]) if record[state_idx] else ""
            if not re.match(r'^[A-Z]{2}$', state_val):
                failure_counts['state'] += 1
                if failure_counts['state'] <= 10:
                    errors.append(f"{shapefile_name}: Feature {idx} State must be 2 letters")

            zip_val = str(record[zip_idx]) if record[zip_idx] else ""
            if not re.match(r'^\d{5}$', zip_val):
                failure_counts['zip'] += 1
                if failure_counts['zip'] <= 10:
                    errors.append(f"{shapefile_name}: Feature {idx} Zip must be 5 digits")

            try:
                int(record[ben_idx])
            except (ValueError, TypeError):
                failure_counts['ben'] += 1
                if failure_counts['ben'] <= 10:
                    errors.append(f"{shapefile_name}: Feature {idx} BEN# must be an integer")

            try:
                float(record[lat_idx])
            except (ValueError, TypeError):
                failure_counts['lat'] += 1
                if failure_counts['lat'] <= 10:
                    errors.append(f"{shapefile_name}: Feature {idx} Latitude is not a number")

            try:
                float(record[lon_idx])
            except (ValueError, TypeError):
                failure_counts['lon'] += 1
                if failure_counts['lon'] <= 10:
                    errors.append(f"{shapefile_name}: Feature {idx} Longitude is not a number")

        for key, count in failure_counts.items():
            if count > 10:
                errors.append(f"{shapefile_name}: 10 or more features failed {key} validation")

        sf.close()
    except Exception as e:
        errors.append(f"{shapefile_name}: Error validating attributes - {str(e)}")

    return errors
|
||||
|
||||
|
||||
def validate_splicing(shp_path: Path, shapefile_name: str):
|
||||
"""Validate splicing shapefile attributes"""
|
||||
errors = []
|
||||
|
||||
try:
|
||||
sf = shapefile.Reader(str(shp_path))
|
||||
field_names = [field[0] for field in sf.fields[1:]]
|
||||
|
||||
required_fields = ['AKA', 'Type', 'Group 1', 'Latitude', 'Longitude']
|
||||
for field in required_fields:
|
||||
if field not in field_names:
|
||||
errors.append(f"{shapefile_name}: Missing required field '{field}'")
|
||||
return errors
|
||||
|
||||
aka_idx = field_names.index('AKA')
|
||||
type_idx = field_names.index('Type')
|
||||
group_idx = field_names.index('Group 1')
|
||||
lat_idx = field_names.index('Latitude')
|
||||
lon_idx = field_names.index('Longitude')
|
||||
|
||||
valid_types = ['MST', 'Splice', 'FTP']
|
||||
failure_counts = {'aka': 0, 'type': 0, 'group': 0, 'lat': 0, 'lon': 0}
|
||||
|
||||
for idx, record in enumerate(sf.records()):
|
||||
aka_val = str(record[aka_idx]) if record[aka_idx] else ""
|
||||
if not re.match(r'^[A-Z]{3}_[A-Z]', aka_val):
|
||||
failure_counts['aka'] += 1
|
||||
if failure_counts['aka'] <= 10:
|
||||
errors.append(f"{shapefile_name}: Feature {idx} AKA must begin with YYY_Y format")
|
||||
|
||||
if record[type_idx] not in valid_types:
|
||||
failure_counts['type'] += 1
|
||||
if failure_counts['type'] <= 10:
|
||||
errors.append(f"{shapefile_name}: Feature {idx} has invalid Type")
|
||||
|
||||
group_val = str(record[group_idx]) if record[group_idx] else ""
|
||||
if not re.match(r'^Zone \d{2}$', group_val):
|
||||
failure_counts['group'] += 1
|
||||
if failure_counts['group'] <= 10:
|
||||
errors.append(f"{shapefile_name}: Feature {idx} has invalid Group 1 format")
|
||||
|
||||
try:
|
||||
float(record[lat_idx])
|
||||
except (ValueError, TypeError):
|
||||
failure_counts['lat'] += 1
|
||||
if failure_counts['lat'] <= 10:
|
||||
errors.append(f"{shapefile_name}: Feature {idx} Latitude is not a number")
|
||||
|
||||
try:
|
||||
float(record[lon_idx])
|
||||
except (ValueError, TypeError):
|
||||
failure_counts['lon'] += 1
|
||||
if failure_counts['lon'] <= 10:
|
||||
errors.append(f"{shapefile_name}: Feature {idx} Longitude is not a number")
|
||||
|
||||
for key, count in failure_counts.items():
|
||||
if count > 10:
|
||||
errors.append(f"{shapefile_name}: 10 or more features failed {key} validation")
|
||||
|
||||
sf.close()
|
||||
except Exception as e:
|
||||
errors.append(f"{shapefile_name}: Error validating attributes - {str(e)}")
|
||||
|
||||
return errors
|
||||
|
||||
|
||||
def validate_network_elements(shp_path: Path, shapefile_name: str):
    """Validate network_elements shapefile attributes"""
    errors = []

    try:
        sf = shapefile.Reader(str(shp_path))
        field_names = [field[0] for field in sf.fields[1:]]

        required_fields = ['Type', 'Group 1', 'Latitude', 'Longitude']
        for field in required_fields:
            if field not in field_names:
                errors.append(f"{shapefile_name}: Missing required field '{field}'")
                return errors

        type_idx = field_names.index('Type')
        group_idx = field_names.index('Group 1')
        lat_idx = field_names.index('Latitude')
        lon_idx = field_names.index('Longitude')

        valid_types = ['Slack Coil', 'Anchor', 'Bore Pit', 'Riser']
        failure_counts = {'type': 0, 'group': 0, 'lat': 0, 'lon': 0}

        for idx, record in enumerate(sf.records()):
            if record[type_idx] not in valid_types:
                failure_counts['type'] += 1
                if failure_counts['type'] <= 10:
                    errors.append(f"{shapefile_name}: Feature {idx} has invalid Type")

            group_val = str(record[group_idx]) if record[group_idx] else ""
            if not re.match(r'^Zone \d{2}$', group_val):
                failure_counts['group'] += 1
                if failure_counts['group'] <= 10:
                    errors.append(f"{shapefile_name}: Feature {idx} has invalid Group 1 format")

            try:
                float(record[lat_idx])
            except (ValueError, TypeError):
                failure_counts['lat'] += 1
                if failure_counts['lat'] <= 10:
                    errors.append(f"{shapefile_name}: Feature {idx} Latitude is not a number")

            try:
                float(record[lon_idx])
            except (ValueError, TypeError):
                failure_counts['lon'] += 1
                if failure_counts['lon'] <= 10:
                    errors.append(f"{shapefile_name}: Feature {idx} Longitude is not a number")

        for key, count in failure_counts.items():
            if count > 10:
                errors.append(f"{shapefile_name}: More than 10 features failed {key} validation")

        sf.close()
    except Exception as e:
        errors.append(f"{shapefile_name}: Error validating attributes - {str(e)}")

    return errors

def validate_poles(shp_path: Path, shapefile_name: str):
    """Validate poles shapefile attributes"""
    errors = []

    try:
        sf = shapefile.Reader(str(shp_path))
        field_names = [field[0] for field in sf.fields[1:]]

        required_fields = ['Pole Tag', 'Pole Owner', 'Group 1', 'Latitude', 'Longitude']
        for field in required_fields:
            if field not in field_names:
                errors.append(f"{shapefile_name}: Missing required field '{field}'")
                return errors

        tag_idx = field_names.index('Pole Tag')
        owner_idx = field_names.index('Pole Owner')
        group_idx = field_names.index('Group 1')
        lat_idx = field_names.index('Latitude')
        lon_idx = field_names.index('Longitude')

        failure_counts = {'tag': 0, 'owner': 0, 'group': 0, 'lat': 0, 'lon': 0}

        for idx, record in enumerate(sf.records()):
            if not record[tag_idx] or str(record[tag_idx]).strip() == '':
                failure_counts['tag'] += 1
                if failure_counts['tag'] <= 10:
                    errors.append(f"{shapefile_name}: Feature {idx} 'Pole Tag' must be populated")

            if not record[owner_idx] or str(record[owner_idx]).strip() == '':
                failure_counts['owner'] += 1
                if failure_counts['owner'] <= 10:
                    errors.append(f"{shapefile_name}: Feature {idx} 'Pole Owner' must be populated")

            group_val = str(record[group_idx]) if record[group_idx] else ""
            if not re.match(r'^Zone \d{2}$', group_val):
                failure_counts['group'] += 1
                if failure_counts['group'] <= 10:
                    errors.append(f"{shapefile_name}: Feature {idx} has invalid Group 1 format")

            try:
                float(record[lat_idx])
            except (ValueError, TypeError):
                failure_counts['lat'] += 1
                if failure_counts['lat'] <= 10:
                    errors.append(f"{shapefile_name}: Feature {idx} Latitude is not a number")

            try:
                float(record[lon_idx])
            except (ValueError, TypeError):
                failure_counts['lon'] += 1
                if failure_counts['lon'] <= 10:
                    errors.append(f"{shapefile_name}: Feature {idx} Longitude is not a number")

        for key, count in failure_counts.items():
            if count > 10:
                errors.append(f"{shapefile_name}: More than 10 features failed {key} validation")

        sf.close()
    except Exception as e:
        errors.append(f"{shapefile_name}: Error validating attributes - {str(e)}")

    return errors

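The validators above all share one reporting pattern: count every failure, append at most ten per-feature messages, and add a single summary line once the cap is exceeded. A minimal standalone sketch of that pattern (the function name and messages here are illustrative, not part of the tool):

```python
def report_failures(name, bad_indices, cap=10):
    """Collect per-item error messages, capped, with one summary line past the cap."""
    errors = []
    count = 0
    for idx in bad_indices:
        count += 1
        # Only the first `cap` failures get an individual message.
        if count <= cap:
            errors.append(f"{name}: Feature {idx} failed validation")
    # A single summary line replaces the messages beyond the cap.
    if count > cap:
        errors.append(f"{name}: More than {cap} features failed validation")
    return errors

msgs = report_failures("poles", range(12))
print(len(msgs))  # 11: ten per-feature messages plus one summary line
```

This keeps the downloaded QC report readable even when a shapefile fails wholesale.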
def point_in_polygon(point, polygon):
    """Check if a point is inside a polygon using the ray casting algorithm"""
    x, y = point
    n = len(polygon)
    inside = False

    p1x, p1y = polygon[0]
    for i in range(n + 1):
        p2x, p2y = polygon[i % n]
        if y > min(p1y, p2y):
            if y <= max(p1y, p2y):
                if x <= max(p1x, p2x):
                    if p1y != p2y:
                        xinters = (y - p1y) * (p2x - p1x) / (p2y - p1y) + p1x
                    if p1x == p2x or x <= xinters:
                        inside = not inside
        p1x, p1y = p2x, p2y

    return inside

def line_crosses_polygon_boundary(line_points, polygon):
    """Check if a line crosses a polygon boundary (for segment exception)"""
    # A line crosses the boundary if it has vertices both inside and outside the polygon
    points_inside = sum(1 for point in line_points if point_in_polygon(point, polygon))
    points_outside = len(line_points) - points_inside

    return points_inside > 0 and points_outside > 0

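The two geometry helpers can be exercised together on a toy polygon. A standalone sketch (both function bodies are repeated here so the snippet runs on its own; the square and segment coordinates are made up for illustration):

```python
def point_in_polygon(point, polygon):
    """Ray-casting point-in-polygon test (same logic as the helper above)."""
    x, y = point
    n = len(polygon)
    inside = False
    p1x, p1y = polygon[0]
    for i in range(n + 1):
        p2x, p2y = polygon[i % n]
        if y > min(p1y, p2y):
            if y <= max(p1y, p2y):
                if x <= max(p1x, p2x):
                    if p1y != p2y:
                        xinters = (y - p1y) * (p2x - p1x) / (p2y - p1y) + p1x
                    if p1x == p2x or x <= xinters:
                        inside = not inside
        p1x, p1y = p2x, p2y
    return inside

def line_crosses_polygon_boundary(line_points, polygon):
    """A line crosses the boundary iff it has vertices both inside and outside."""
    points_inside = sum(1 for p in line_points if point_in_polygon(p, polygon))
    points_outside = len(line_points) - points_inside
    return points_inside > 0 and points_outside > 0

square = [(0, 0), (1, 0), (1, 1), (0, 1)]
# A segment entering the square from the left: one vertex outside, one inside.
crossing = [(-0.5, 0.5), (0.5, 0.5)]
# A segment fully contained in the square.
contained = [(0.25, 0.5), (0.75, 0.5)]
print(line_crosses_polygon_boundary(crossing, square))   # True
print(line_crosses_polygon_boundary(contained, square))  # False
```

Note this vertex-based test only sees the line's vertices, not the interior of each edge, which is why the spatial check below treats boundary-crossing segments as an explicit exception rather than trying to classify them.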
def extract_zone_number(field_value):
    """Extract 2-digit zone number from field value like 'Zone 07' or 'Zone 07 Boundary'"""
    match = re.search(r'Zone (\d{2})', str(field_value))
    if match:
        return match.group(1)
    return None

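Zone extraction is what links a feature's `Group 1` value to a cabinet's `Name` value. A quick standalone check (the function body is repeated so the snippet runs on its own; the sample strings are illustrative):

```python
import re

def extract_zone_number(field_value):
    """Extract the 2-digit zone number from values like 'Zone 07' or 'Zone 07 Boundary'."""
    match = re.search(r'Zone (\d{2})', str(field_value))
    if match:
        return match.group(1)
    return None

print(extract_zone_number("Zone 07 Boundary"))  # 07
print(extract_zone_number("Zone 12"))           # 12
print(extract_zone_number("Cabinet A"))         # None
```

Because the zone comes back as a zero-padded string, comparing a feature's zone to a cabinet's zone is a simple equality check with no integer conversion.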
def validate_spatial_containment(shapefile_dir: Path):
    """Validate that features are within their correct cabinet boundaries"""
    errors = []

    try:
        # Load cabinet boundaries
        cabinet_path = shapefile_dir / "cabinet_boundaries.shp"
        cabinet_sf = shapefile.Reader(str(cabinet_path))
        cabinet_records = cabinet_sf.records()
        cabinet_shapes = cabinet_sf.shapes()
        cabinet_fields = [field[0] for field in cabinet_sf.fields[1:]]

        if 'Name' not in cabinet_fields:
            errors.append("cabinet_boundaries: Missing 'Name' field for spatial validation")
            return errors

        name_idx = cabinet_fields.index('Name')

        # Build cabinet boundary data structure
        cabinets = []
        for idx, (record, shape) in enumerate(zip(cabinet_records, cabinet_shapes)):
            zone_num = extract_zone_number(record[name_idx])
            if zone_num:
                # Handle Polygon (5) and PolygonZ (15) shape types
                if shape.shapeType in (5, 15):
                    cabinets.append({'zone': zone_num, 'polygon': shape.points})

        if not cabinets:
            errors.append("cabinet_boundaries: No valid cabinet boundaries found with zone numbers")
            return errors

        # Validate each feature type: (shapefile_name, has_group1, is_line)
        feature_types = [
            ('sites', True, False),
            ('access_points', True, False),
            ('permits', False, False),  # permits have no Group 1 field, so zone matching is skipped
            ('splicing', True, False),
            ('network_elements', True, False),
            ('poles', True, False),
            ('segments', True, True),  # segments are lines
        ]

        for shapefile_name, has_group1, is_line in feature_types:
            shp_path = shapefile_dir / f"{shapefile_name}.shp"

            try:
                sf = shapefile.Reader(str(shp_path))
                records = sf.records()
                shapes = sf.shapes()
                field_names = [field[0] for field in sf.fields[1:]]

                # Get Group 1 field index if it exists
                group1_idx = None
                if has_group1 and 'Group 1' in field_names:
                    group1_idx = field_names.index('Group 1')

                # Get UID for error reporting
                uid_idx = field_names.index('UID') if 'UID' in field_names else None

                failure_counts = {'wrong_zone': 0, 'outside_all': 0}

                for idx, (record, shape) in enumerate(zip(records, shapes)):
                    uid = record[uid_idx] if uid_idx is not None else idx

                    # Get feature zone number
                    feature_zone = None
                    if group1_idx is not None:
                        feature_zone = extract_zone_number(record[group1_idx])

                    # Get feature geometry
                    if is_line:
                        # For segments, use all vertices
                        feature_points = shape.points
                    else:
                        # For points, use the first vertex
                        if len(shape.points) > 0:
                            feature_points = [shape.points[0]]
                        else:
                            continue

                    # Check if feature is in any cabinet boundary
                    in_any_cabinet = False
                    in_correct_cabinet = False
                    crosses_boundary = False

                    for cabinet in cabinets:
                        if is_line:
                            # Check if the line crosses this boundary
                            if line_crosses_polygon_boundary(feature_points, cabinet['polygon']):
                                crosses_boundary = True
                                break
                            # Check if any vertex is in this cabinet
                            for point in feature_points:
                                if point_in_polygon(point, cabinet['polygon']):
                                    in_any_cabinet = True
                                    if feature_zone == cabinet['zone']:
                                        in_correct_cabinet = True
                                    break
                        else:
                            # For points, check if in this cabinet
                            if point_in_polygon(feature_points[0], cabinet['polygon']):
                                in_any_cabinet = True
                                if feature_zone == cabinet['zone']:
                                    in_correct_cabinet = True
                                break

                    # Exception for segments that cross boundaries
                    if shapefile_name == 'segments' and crosses_boundary:
                        continue

                    # Feature outside all cabinet boundaries
                    if not in_any_cabinet:
                        failure_counts['outside_all'] += 1
                        if failure_counts['outside_all'] <= 10:
                            errors.append(f"{shapefile_name}: Feature UID {uid} is outside all cabinet boundaries")

                    # Feature in wrong zone (only if it has a Group 1 field)
                    elif has_group1 and not in_correct_cabinet and feature_zone:
                        failure_counts['wrong_zone'] += 1
                        if failure_counts['wrong_zone'] <= 10:
                            errors.append(f"{shapefile_name}: Feature UID {uid} is in wrong cabinet boundary (expected Zone {feature_zone})")

                if failure_counts['outside_all'] > 10:
                    errors.append(f"{shapefile_name}: More than 10 features failed the outside-all-cabinet-boundaries validation")

                if failure_counts['wrong_zone'] > 10:
                    errors.append(f"{shapefile_name}: More than 10 features failed the wrong-cabinet-boundary validation")

                sf.close()

            except Exception as e:
                errors.append(f"{shapefile_name}: Error during spatial validation - {str(e)}")

        cabinet_sf.close()

    except Exception as e:
        errors.append(f"Spatial validation error: {str(e)}")

    return errors
4 backend/requirements.txt Normal file
@ -0,0 +1,4 @@
fastapi==0.104.1
uvicorn==0.24.0
python-multipart==0.0.6
pyshp==2.3.1
BIN frontend/celebrate.png Normal file
Binary file not shown. (Size: 2.0 MiB)
28 frontend/index.html Normal file
@ -0,0 +1,28 @@
<!DOCTYPE html>
<html lang="en">
<head>
    <meta charset="UTF-8">
    <meta name="viewport" content="width=device-width, initial-scale=1.0">
    <title>Verofy Shapefile Upload</title>
    <link rel="stylesheet" href="style.css?v=3">
</head>
<body>
    <div class="main-container">
        <div class="logo-section">
            <img src="logo.png" alt="Vero Dragndrop HLD Logo" class="logo">
        </div>
        <div class="upload-section">
            <div class="container">
                <h1>Verofy HLD Shapefile Upload</h1>
                <div id="drop-area">
                    <p>Drag &amp; Drop your ZIP file here</p>
                    <input type="file" id="fileElem" accept=".zip" />
                </div>
                <div id="result"></div>
            </div>
        </div>
    </div>

    <script src="upload.js?v=3"></script>
</body>
</html>
BIN frontend/logo.png Normal file
Binary file not shown. (Size: 2.2 MiB)
114 frontend/style.css Normal file
@ -0,0 +1,114 @@
* {
    margin: 0;
    padding: 0;
    box-sizing: border-box;
}

body {
    font-family: Arial, sans-serif;
    height: 100vh;
    overflow: hidden;
}

.main-container {
    display: flex;
    height: 100vh;
}

.logo-section {
    flex: 0 0 45%;
    background: linear-gradient(135deg, #001f3f 0%, #003366 100%);
    display: flex;
    justify-content: center;
    align-items: center;
    padding: 60px;
}

.logo {
    width: 85%;
    height: auto;
    object-fit: contain;
}

.upload-section {
    flex: 0 0 55%;
    background: #e8e8e8;
    display: flex;
    justify-content: center;
    align-items: center;
}

.container {
    text-align: center;
    padding: 40px;
    width: 100%;
    max-width: 500px;
}

h1 {
    color: #333;
    margin-bottom: 30px;
    font-size: 24px;
}

#drop-area {
    border: 2px dashed #007BFF;
    border-radius: 10px;
    width: 100%;
    min-height: 200px;
    margin: 20px auto;
    display: flex;
    justify-content: center;
    align-items: center;
    flex-direction: column;
    background: #fff;
    cursor: pointer;
    padding: 40px 20px;
    transition: all 0.3s ease;
}

#drop-area:hover {
    border-color: #0056b3;
    background: #f8f9fa;
}

#drop-area.highlight {
    border-color: #0056b3;
    background: #e7f3ff;
}

#drop-area p {
    margin-bottom: 15px;
    color: #666;
    font-size: 16px;
}

.button {
    margin-top: 10px;
    background: #007BFF;
    color: #fff;
    padding: 10px 20px;
    border-radius: 5px;
    cursor: pointer;
    border: none;
}

#result {
    margin-top: 20px;
    font-weight: bold;
}

/* Responsive design */
@media (max-width: 768px) {
    .main-container {
        flex-direction: column;
    }

    .logo-section {
        flex: 0 0 30%;
    }

    .upload-section {
        flex: 0 0 70%;
    }
}
152 frontend/upload.js Normal file
@ -0,0 +1,152 @@
const dropArea = document.getElementById('drop-area')
const fileInput = document.getElementById('fileElem')

// Prevent default drag behaviors
;['dragenter', 'dragover', 'dragleave', 'drop'].forEach(eventName => {
    dropArea.addEventListener(eventName, preventDefaults, false)
})

function preventDefaults(e) {
    e.preventDefault()
    e.stopPropagation()
}

// Highlight drop area on dragover
;['dragenter', 'dragover'].forEach(eventName => {
    dropArea.addEventListener(eventName, () => dropArea.classList.add('highlight'), false)
})

;['dragleave', 'drop'].forEach(eventName => {
    dropArea.addEventListener(eventName, () => dropArea.classList.remove('highlight'), false)
})

// Handle dropped files
dropArea.addEventListener('drop', e => {
    const dt = e.dataTransfer
    const files = dt.files
    handleFiles(files)
})

// Handle selected files from input
fileInput.addEventListener('change', e => {
    handleFiles(fileInput.files)
})

function handleFiles(files) {
    if (!files || files.length === 0) return
    const file = files[0]

    if (!file.name.endsWith('.zip')) {
        alert('Please upload a ZIP file.')
        return
    }

    uploadFile(file)
}

function uploadFile(file) {
    const url = 'http://localhost:8000/upload'
    const formData = new FormData()
    formData.append('file', file)

    dropArea.innerHTML = `<p>Uploading ${file.name}...</p>`

    fetch(url, {
        method: 'POST',
        body: formData
    })
    .then(async response => {
        const contentType = response.headers.get('Content-Type')

        // text/plain means the server returned a QC failure report
        if (contentType && contentType.includes('text/plain')) {
            // QC failed - download the report
            const blob = await response.blob()
            const link = document.createElement('a')
            link.href = window.URL.createObjectURL(blob)
            link.download = 'QC_report.txt'
            link.click()
            dropArea.innerHTML = `<p>QC failed. Report downloaded.</p>`
        } else if (contentType && contentType.includes('application/json')) {
            // QC passed - show success and the VerofyMapID input
            const data = await response.json()
            if (data.message === 'success') {
                showVerofyMapIdInput()
            } else {
                dropArea.innerHTML = `<p>Unexpected response from server.</p>`
            }
        } else {
            // Unknown response type; fall back to plain text
            const text = await response.text()
            dropArea.innerHTML = `<p>Unexpected response: ${text.substring(0, 100)}</p>`
        }
    })
    .catch((error) => {
        console.error('Upload error:', error)
        dropArea.innerHTML = `<p>Upload failed. Check console.</p>`
    })
}

function showVerofyMapIdInput() {
    dropArea.innerHTML = `
        <div style="padding: 20px;">
            <p style="color: green; font-weight: bold; margin-bottom: 15px;">
                Your files have passed QC!
            </p>
            <p style="margin-bottom: 10px;">Please provide VerofyMapID:</p>
            <input type="number" id="verofyMapId" placeholder="Enter Map ID"
                style="padding: 8px; width: 200px; margin-bottom: 10px; border: 1px solid #ccc; border-radius: 4px;" />
            <br/>
            <button onclick="submitMapId()"
                style="padding: 10px 20px; background: #007BFF; color: white; border: none; border-radius: 5px; cursor: pointer;">
                Submit
            </button>
        </div>
    `
}

function submitMapId() {
    const mapIdInput = document.getElementById('verofyMapId')
    const mapId = mapIdInput.value

    if (!mapId || mapId.trim() === '') {
        alert('Please enter a VerofyMapID')
        return
    }

    // Update the drop area to show the success message
    dropArea.innerHTML = `
        <p style="color: green; font-weight: bold;">
            Success! VerofyMapID ${mapId} received.
        </p>
    `

    // Create overlay with celebration image
    const overlay = document.createElement('div')
    overlay.id = 'celebrationOverlay'
    overlay.style.cssText = `
        position: fixed;
        top: 0;
        left: 0;
        width: 100vw;
        height: 100vh;
        background: rgba(0, 0, 0, 0.7);
        display: flex;
        justify-content: center;
        align-items: center;
        z-index: 10000;
        cursor: pointer;
    `

    overlay.innerHTML = `
        <img src="celebrate.png" alt="Celebration"
            style="max-width: 80%; max-height: 80%; object-fit: contain;" />
    `

    // Remove overlay on click
    overlay.addEventListener('click', () => {
        overlay.remove()
    })

    document.body.appendChild(overlay)
}
5 oldqc/.gitignore vendored Normal file
@ -0,0 +1,5 @@
db/
Backend/server.exe
Backend/server
Backend/tmp/
Backend/build-errors.log
3 oldqc/.gitignore:Zone.Identifier Normal file
@ -0,0 +1,3 @@
[ZoneTransfer]
ZoneId=3
ReferrerUrl=C:\Users\AlexanderHall\Downloads\Auto_LLD-QC-main.zip
44 oldqc/Backend/.air.toml Normal file
@ -0,0 +1,44 @@
root = "."
testdata_dir = "testdata"
tmp_dir = "tmp"

[build]
args_bin = []
bin = "tmp\\main.exe"
cmd = "go build -o ./tmp/main.exe ."
delay = 1000
exclude_dir = ["assets", "tmp", "vendor", "testdata"]
exclude_file = []
exclude_regex = ["_test.go"]
exclude_unchanged = false
follow_symlink = false
full_bin = ""
include_dir = []
include_ext = ["go", "tpl", "tmpl", "html"]
include_file = []
kill_delay = "0s"
log = "build-errors.log"
poll = false
poll_interval = 0
rerun = false
rerun_delay = 500
send_interrupt = false
stop_on_error = false

[color]
app = ""
build = "yellow"
main = "magenta"
runner = "green"
watcher = "cyan"

[log]
main_only = false
time = false

[misc]
clean_on_exit = false

[screen]
clear_on_rebuild = false
keep_scroll = true
3 oldqc/Backend/.air.toml:Zone.Identifier Normal file
@ -0,0 +1,3 @@
[ZoneTransfer]
ZoneId=3
ReferrerUrl=C:\Users\AlexanderHall\Downloads\Auto_LLD-QC-main.zip
11 oldqc/Backend/.env Normal file
@ -0,0 +1,11 @@
DB_HOST=bomar.cloud
DB_USER=ospe
DB_PASS=R5TU8Ml8KHE05LKdMvwulJl0VOeQwUCUMXQrMMqXb10=
DB_NAME=vero
DB_PORT=5432
SCHEMA_NAME=eli_test
SEGMENT_TABLE=segment2
ZONE_COLUMN=group_1
MAPID_COLUMN=mapid
ID_COLUMN=id
QCFLAG_COLUMN=qc_flag
3 oldqc/Backend/.env:Zone.Identifier Normal file
@ -0,0 +1,3 @@
[ZoneTransfer]
ZoneId=3
ReferrerUrl=C:\Users\AlexanderHall\Downloads\Auto_LLD-QC-main.zip
4 oldqc/Backend/build.bat Normal file
@ -0,0 +1,4 @@
@echo off
echo Building server...
go build -o server.exe main.go
echo Build complete! Run with: server.exe
3 oldqc/Backend/build.bat:Zone.Identifier Normal file
@ -0,0 +1,3 @@
[ZoneTransfer]
ZoneId=3
ReferrerUrl=C:\Users\AlexanderHall\Downloads\Auto_LLD-QC-main.zip
48 oldqc/Backend/go.mod Normal file
@ -0,0 +1,48 @@
module verofy-backend

go 1.24.3

require (
	github.com/gin-contrib/cors v1.7.6
	github.com/gin-gonic/gin v1.10.1
	gorm.io/driver/postgres v1.6.0
	gorm.io/gorm v1.30.0
)

require (
	github.com/bytedance/sonic v1.13.3 // indirect
	github.com/bytedance/sonic/loader v0.2.4 // indirect
	github.com/cloudwego/base64x v0.1.5 // indirect
	github.com/gabriel-vasile/mimetype v1.4.9 // indirect
	github.com/gin-contrib/sse v1.1.0 // indirect
	github.com/go-playground/locales v0.14.1 // indirect
	github.com/go-playground/universal-translator v0.18.1 // indirect
	github.com/go-playground/validator/v10 v10.26.0 // indirect
	github.com/goccy/go-json v0.10.5 // indirect
	github.com/jackc/pgpassfile v1.0.0 // indirect
	github.com/jackc/pgservicefile v0.0.0-20240606120523-5a60cdf6a761 // indirect
	github.com/jackc/pgx/v5 v5.6.0 // indirect
	github.com/jackc/puddle/v2 v2.2.2 // indirect
	github.com/jinzhu/inflection v1.0.0 // indirect
	github.com/jinzhu/now v1.1.5 // indirect
	github.com/joho/godotenv v1.5.1
	github.com/json-iterator/go v1.1.12 // indirect
	github.com/klauspost/cpuid/v2 v2.2.10 // indirect
	github.com/kr/text v0.2.0 // indirect
	github.com/leodido/go-urn v1.4.0 // indirect
	github.com/mattn/go-isatty v0.0.20 // indirect
	github.com/modern-go/concurrent v0.0.0-20180306012644-bacd9c7ef1dd // indirect
	github.com/modern-go/reflect2 v1.0.2 // indirect
	github.com/pelletier/go-toml/v2 v2.2.4 // indirect
	github.com/rogpeppe/go-internal v1.14.1 // indirect
	github.com/twitchyliquid64/golang-asm v0.15.1 // indirect
	github.com/ugorji/go/codec v1.3.0 // indirect
	golang.org/x/arch v0.18.0 // indirect
	golang.org/x/crypto v0.39.0 // indirect
	golang.org/x/net v0.41.0 // indirect
	golang.org/x/sync v0.15.0 // indirect
	golang.org/x/sys v0.33.0 // indirect
	golang.org/x/text v0.26.0 // indirect
	google.golang.org/protobuf v1.36.6 // indirect
	gopkg.in/yaml.v3 v3.0.1 // indirect
)
3 oldqc/Backend/go.mod:Zone.Identifier Normal file
@ -0,0 +1,3 @@
[ZoneTransfer]
ZoneId=3
ReferrerUrl=C:\Users\AlexanderHall\Downloads\Auto_LLD-QC-main.zip
112 oldqc/Backend/go.sum Normal file
@ -0,0 +1,112 @@
github.com/bytedance/sonic v1.13.3 h1:MS8gmaH16Gtirygw7jV91pDCN33NyMrPbN7qiYhEsF0=
github.com/bytedance/sonic v1.13.3/go.mod h1:o68xyaF9u2gvVBuGHPlUVCy+ZfmNNO5ETf1+KgkJhz4=
github.com/bytedance/sonic/loader v0.1.1/go.mod h1:ncP89zfokxS5LZrJxl5z0UJcsk4M4yY2JpfqGeCtNLU=
github.com/bytedance/sonic/loader v0.2.4 h1:ZWCw4stuXUsn1/+zQDqeE7JKP+QO47tz7QCNan80NzY=
github.com/bytedance/sonic/loader v0.2.4/go.mod h1:N8A3vUdtUebEY2/VQC0MyhYeKUFosQU6FxH2JmUe6VI=
github.com/cloudwego/base64x v0.1.5 h1:XPciSp1xaq2VCSt6lF0phncD4koWyULpl5bUxbfCyP4=
github.com/cloudwego/base64x v0.1.5/go.mod h1:0zlkT4Wn5C6NdauXdJRhSKRlJvmclQ1hhJgA0rcu/8w=
github.com/cloudwego/iasm v0.2.0/go.mod h1:8rXZaNYT2n95jn+zTI1sDr+IgcD2GVs0nlbbQPiEFhY=
github.com/creack/pty v1.1.9/go.mod h1:oKZEueFk5CKHvIhNR5MUki03XCEU+Q6VDXinZuGJ33E=
github.com/davecgh/go-spew v1.1.0/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=
github.com/davecgh/go-spew v1.1.1 h1:vj9j/u1bqnvCEfJOwUhtlOARqs3+rkHYY13jYWTU97c=
github.com/davecgh/go-spew v1.1.1/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=
github.com/gabriel-vasile/mimetype v1.4.9 h1:5k+WDwEsD9eTLL8Tz3L0VnmVh9QxGjRmjBvAG7U/oYY=
github.com/gabriel-vasile/mimetype v1.4.9/go.mod h1:WnSQhFKJuBlRyLiKohA/2DtIlPFAbguNaG7QCHcyGok=
github.com/gin-contrib/cors v1.7.6 h1:3gQ8GMzs1Ylpf70y8bMw4fVpycXIeX1ZemuSQIsnQQY=
github.com/gin-contrib/cors v1.7.6/go.mod h1:Ulcl+xN4jel9t1Ry8vqph23a60FwH9xVLd+3ykmTjOk=
github.com/gin-contrib/sse v1.1.0 h1:n0w2GMuUpWDVp7qSpvze6fAu9iRxJY4Hmj6AmBOU05w=
github.com/gin-contrib/sse v1.1.0/go.mod h1:hxRZ5gVpWMT7Z0B0gSNYqqsSCNIJMjzvm6fqCz9vjwM=
github.com/gin-gonic/gin v1.10.1 h1:T0ujvqyCSqRopADpgPgiTT63DUQVSfojyME59Ei63pQ=
github.com/gin-gonic/gin v1.10.1/go.mod h1:4PMNQiOhvDRa013RKVbsiNwoyezlm2rm0uX/T7kzp5Y=
github.com/go-playground/assert/v2 v2.2.0 h1:JvknZsQTYeFEAhQwI4qEt9cyV5ONwRHC+lYKSsYSR8s=
github.com/go-playground/assert/v2 v2.2.0/go.mod h1:VDjEfimB/XKnb+ZQfWdccd7VUvScMdVu0Titje2rxJ4=
github.com/go-playground/locales v0.14.1 h1:EWaQ/wswjilfKLTECiXz7Rh+3BjFhfDFKv/oXslEjJA=
github.com/go-playground/locales v0.14.1/go.mod h1:hxrqLVvrK65+Rwrd5Fc6F2O76J/NuW9t0sjnWqG1slY=
github.com/go-playground/universal-translator v0.18.1 h1:Bcnm0ZwsGyWbCzImXv+pAJnYK9S473LQFuzCbDbfSFY=
github.com/go-playground/universal-translator v0.18.1/go.mod h1:xekY+UJKNuX9WP91TpwSH2VMlDf28Uj24BCp08ZFTUY=
github.com/go-playground/validator/v10 v10.26.0 h1:SP05Nqhjcvz81uJaRfEV0YBSSSGMc/iMaVtFbr3Sw2k=
github.com/go-playground/validator/v10 v10.26.0/go.mod h1:I5QpIEbmr8On7W0TktmJAumgzX4CA1XNl4ZmDuVHKKo=
github.com/goccy/go-json v0.10.5 h1:Fq85nIqj+gXn/S5ahsiTlK3TmC85qgirsdTP/+DeaC4=
github.com/goccy/go-json v0.10.5/go.mod h1:oq7eo15ShAhp70Anwd5lgX2pLfOS3QCiwU/PULtXL6M=
github.com/google/go-cmp v0.7.0 h1:wk8382ETsv4JYUZwIsn6YpYiWiBsYLSJiTsyBybVuN8=
github.com/google/go-cmp v0.7.0/go.mod h1:pXiqmnSA92OHEEa9HXL2W4E7lf9JzCmGVUdgjX3N/iU=
github.com/google/gofuzz v1.0.0/go.mod h1:dBl0BpW6vV/+mYPU4Po3pmUjxk6FQPldtuIdl/M65Eg=
github.com/jackc/pgpassfile v1.0.0 h1:/6Hmqy13Ss2zCq62VdNG8tM1wchn8zjSGOBJ6icpsIM=
github.com/jackc/pgpassfile v1.0.0/go.mod h1:CEx0iS5ambNFdcRtxPj5JhEz+xB6uRky5eyVu/W2HEg=
github.com/jackc/pgservicefile v0.0.0-20240606120523-5a60cdf6a761 h1:iCEnooe7UlwOQYpKFhBabPMi4aNAfoODPEFNiAnClxo=
github.com/jackc/pgservicefile v0.0.0-20240606120523-5a60cdf6a761/go.mod h1:5TJZWKEWniPve33vlWYSoGYefn3gLQRzjfDlhSJ9ZKM=
github.com/jackc/pgx/v5 v5.6.0 h1:SWJzexBzPL5jb0GEsrPMLIsi/3jOo7RHlzTjcAeDrPY=
github.com/jackc/pgx/v5 v5.6.0/go.mod h1:DNZ/vlrUnhWCoFGxHAG8U2ljioxukquj7utPDgtQdTw=
github.com/jackc/puddle/v2 v2.2.2 h1:PR8nw+E/1w0GLuRFSmiioY6UooMp6KJv0/61nB7icHo=
github.com/jackc/puddle/v2 v2.2.2/go.mod h1:vriiEXHvEE654aYKXXjOvZM39qJ0q+azkZFrfEOc3H4=
github.com/jinzhu/inflection v1.0.0 h1:K317FqzuhWc8YvSVlFMCCUb36O/S9MCKRDI7QkRKD/E=
github.com/jinzhu/inflection v1.0.0/go.mod h1:h+uFLlag+Qp1Va5pdKtLDYj+kHp5pxUVkryuEj+Srlc=
github.com/jinzhu/now v1.1.5 h1:/o9tlHleP7gOFmsnYNz3RGnqzefHA47wQpKrrdTIwXQ=
github.com/jinzhu/now v1.1.5/go.mod h1:d3SSVoowX0Lcu0IBviAWJpolVfI5UJVZZ7cO71lE/z8=
github.com/joho/godotenv v1.5.1 h1:7eLL/+HRGLY0ldzfGMeQkb7vMd0as4CfYvUVzLqw0N0=
github.com/joho/godotenv v1.5.1/go.mod h1:f4LDr5Voq0i2e/R5DDNOoa2zzDfwtkZa6DnEwAbqwq4=
github.com/json-iterator/go v1.1.12 h1:PV8peI4a0ysnczrg+LtxykD8LfKY9ML6u2jnxaEnrnM=
github.com/json-iterator/go v1.1.12/go.mod h1:e30LSqwooZae/UwlEbR2852Gd8hjQvJoHmT4TnhNGBo=
github.com/klauspost/cpuid/v2 v2.0.9/go.mod h1:FInQzS24/EEf25PyTYn52gqo7WaD8xa0213Md/qVLRg=
github.com/klauspost/cpuid/v2 v2.2.10 h1:tBs3QSyvjDyFTq3uoc/9xFpCuOsJQFNPiAhYdw2skhE=
github.com/klauspost/cpuid/v2 v2.2.10/go.mod h1:hqwkgyIinND0mEev00jJYCxPNVRVXFQeu1XKlok6oO0=
github.com/knz/go-libedit v1.10.1/go.mod h1:MZTVkCWyz0oBc7JOWP3wNAzd002ZbM/5hgShxwh4x8M=
github.com/kr/pretty v0.3.0 h1:WgNl7dwNpEZ6jJ9k1snq4pZsg7DOEN8hP9Xw0Tsjwk0=
github.com/kr/pretty v0.3.0/go.mod h1:640gp4NfQd8pI5XOwp5fnNeVWj67G7CFk/SaSQn7NBk=
github.com/kr/text v0.2.0 h1:5Nx0Ya0ZqY2ygV366QzturHI13Jq95ApcVaJBhpS+AY=
github.com/kr/text v0.2.0/go.mod h1:eLer722TekiGuMkidMxC/pM04lWEeraHUUmBw8l2grE=
github.com/leodido/go-urn v1.4.0 h1:WT9HwE9SGECu3lg4d/dIA+jxlljEa1/ffXKmRjqdmIQ=
github.com/leodido/go-urn v1.4.0/go.mod h1:bvxc+MVxLKB4z00jd1z+Dvzr47oO32F/QSNjSBOlFxI=
github.com/mattn/go-isatty v0.0.20 h1:xfD0iDuEKnDkl03q4limB+vH+GxLEtL/jb4xVJSWWEY=
github.com/mattn/go-isatty v0.0.20/go.mod h1:W+V8PltTTMOvKvAeJH7IuucS94S2C6jfK/D7dTCTo3Y=
github.com/modern-go/concurrent v0.0.0-20180228061459-e0a39a4cb421/go.mod h1:6dJC0mAP4ikYIbvyc7fijjWJddQyLn8Ig3JB5CqoB9Q=
github.com/modern-go/concurrent v0.0.0-20180306012644-bacd9c7ef1dd h1:TRLaZ9cD/w8PVh93nsPXa1VrQ6jlwL5oN8l14QlcNfg=
github.com/modern-go/concurrent v0.0.0-20180306012644-bacd9c7ef1dd/go.mod h1:6dJC0mAP4ikYIbvyc7fijjWJddQyLn8Ig3JB5CqoB9Q=
github.com/modern-go/reflect2 v1.0.2 h1:xBagoLtFs94CBntxluKeaWgTMpvLxC4ur3nMaC9Gz0M=
github.com/modern-go/reflect2 v1.0.2/go.mod h1:yWuevngMOJpCy52FWWMvUC8ws7m/LJsjYzDa0/r8luk=
github.com/pelletier/go-toml/v2 v2.2.4 h1:mye9XuhQ6gvn5h28+VilKrrPoQVanw5PMw/TB0t5Ec4=
github.com/pelletier/go-toml/v2 v2.2.4/go.mod h1:2gIqNv+qfxSVS7cM2xJQKtLSTLUE9V8t9Stt+h56mCY=
github.com/pmezard/go-difflib v1.0.0 h1:4DBwDE0NGyQoBHbLQYPwSUPoCMWR5BEzIk/f1lZbAQM=
github.com/pmezard/go-difflib v1.0.0/go.mod h1:iKH77koFhYxTK1pcRnkKkqfTogsbg7gZNVY4sRDYZ/4=
github.com/rogpeppe/go-internal v1.14.1 h1:UQB4HGPB6osV0SQTLymcB4TgvyWu6ZyliaW0tI/otEQ=
github.com/rogpeppe/go-internal v1.14.1/go.mod h1:MaRKkUm5W0goXpeCfT7UZI6fk/L7L7so1lCWt35ZSgc=
github.com/stretchr/objx v0.1.0/go.mod h1:HFkY916IF+rwdDfMAkV7OtwuqBVzrE8GR6GFx+wExME=
github.com/stretchr/objx v0.4.0/go.mod h1:YvHI0jy2hoMjB+UWwv71VJQ9isScKT/TqJzVSSt89Yw=
github.com/stretchr/objx v0.5.0/go.mod h1:Yh+to48EsGEfYuaHDzXPcE3xhTkx73EhmCGUpEOglKo=
github.com/stretchr/testify v1.3.0/go.mod h1:M5WIy9Dh21IEIfnGCwXGc5bZfKNJtfHm1UVUgZn+9EI=
github.com/stretchr/testify v1.7.0/go.mod h1:6Fq8oRcR53rry900zMqJjRRixrwX3KX962/h/Wwjteg=
github.com/stretchr/testify v1.7.1/go.mod h1:6Fq8oRcR53rry900zMqJjRRixrwX3KX962/h/Wwjteg=
github.com/stretchr/testify v1.8.0/go.mod h1:yNjHg4UonilssWZ8iaSj1OCr/vHnekPRkoO+kdMU+MU=
github.com/stretchr/testify v1.8.1/go.mod h1:w2LPCIKwWwSfY2zedu0+kehJoqGctiVI29o6fzry7u4=
github.com/stretchr/testify v1.10.0 h1:Xv5erBjTwe/5IxqUQTdXv5kgmIvbHo3QQyRwhJsOfJA=
github.com/stretchr/testify v1.10.0/go.mod h1:r2ic/lqez/lEtzL7wO/rwa5dbSLXVDPFyf8C91i36aY=
github.com/twitchyliquid64/golang-asm v0.15.1 h1:SU5vSMR7hnwNxj24w34ZyCi/FmDZTkS4MhqMhdFk5YI=
github.com/twitchyliquid64/golang-asm v0.15.1/go.mod h1:a1lVb/DtPvCB8fslRZhAngC2+aY1QWCk3Cedj/Gdt08=
github.com/ugorji/go/codec v1.3.0 h1:Qd2W2sQawAfG8XSvzwhBeoGq71zXOC/Q1E9y/wUcsUA=
github.com/ugorji/go/codec v1.3.0/go.mod h1:pRBVtBSKl77K30Bv8R2P+cLSGaTtex6fsA2Wjqmfxj4=
golang.org/x/arch v0.18.0 h1:WN9poc33zL4AzGxqf8VtpKUnGvMi8O9lhNyBMF/85qc=
golang.org/x/arch v0.18.0/go.mod h1:bdwinDaKcfZUGpH09BB7ZmOfhalA8lQdzl62l8gGWsk=
golang.org/x/crypto v0.39.0 h1:SHs+kF4LP+f+p14esP5jAoDpHU8Gu/v9lFRK6IT5imM=
golang.org/x/crypto v0.39.0/go.mod h1:L+Xg3Wf6HoL4Bn4238Z6ft6KfEpN0tJGo53AAPC632U=
golang.org/x/net v0.41.0 h1:vBTly1HeNPEn3wtREYfy4GZ/NECgw2Cnl+nK6Nz3uvw=
|
||||
golang.org/x/net v0.41.0/go.mod h1:B/K4NNqkfmg07DQYrbwvSluqCJOOXwUjeb/5lOisjbA=
|
||||
golang.org/x/sync v0.15.0 h1:KWH3jNZsfyT6xfAfKiz6MRNmd46ByHDYaZ7KSkCtdW8=
|
||||
golang.org/x/sync v0.15.0/go.mod h1:1dzgHSNfp02xaA81J2MS99Qcpr2w7fw1gpm99rleRqA=
|
||||
golang.org/x/sys v0.6.0/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
|
||||
golang.org/x/sys v0.33.0 h1:q3i8TbbEz+JRD9ywIRlyRAQbM0qF7hu24q3teo2hbuw=
|
||||
golang.org/x/sys v0.33.0/go.mod h1:BJP2sWEmIv4KK5OTEluFJCKSidICx8ciO85XgH3Ak8k=
|
||||
golang.org/x/text v0.26.0 h1:P42AVeLghgTYr4+xUnTRKDMqpar+PtX7KWuNQL21L8M=
|
||||
golang.org/x/text v0.26.0/go.mod h1:QK15LZJUUQVJxhz7wXgxSy/CJaTFjd0G+YLonydOVQA=
|
||||
google.golang.org/protobuf v1.36.6 h1:z1NpPI8ku2WgiWnf+t9wTPsn6eP1L7ksHUlkfLvd9xY=
|
||||
google.golang.org/protobuf v1.36.6/go.mod h1:jduwjTPXsFjZGTmRluh+L6NjiWu7pchiJ2/5YcXBHnY=
|
||||
gopkg.in/check.v1 v0.0.0-20161208181325-20d25e280405/go.mod h1:Co6ibVJAznAaIkqp8huTwlJQCZ016jof/cbN4VW5Yz0=
|
||||
gopkg.in/check.v1 v1.0.0-20201130134442-10cb98267c6c h1:Hei/4ADfdWqJk1ZMxUNpqntNwaWcugrBjAiHlqqRiVk=
|
||||
gopkg.in/check.v1 v1.0.0-20201130134442-10cb98267c6c/go.mod h1:JHkPIbrfpd72SG/EVd6muEfDQjcINNoR0C8j2r3qZ4Q=
|
||||
gopkg.in/yaml.v3 v3.0.0-20200313102051-9f266ea9e77c/go.mod h1:K4uyk7z7BCEPqu6E+C64Yfv1cQ7kz7rIZviUmN+EgEM=
|
||||
gopkg.in/yaml.v3 v3.0.1 h1:fxVm/GzAzEWqLHuvctI91KS9hhNmmWOoWu0XTYJS7CA=
|
||||
gopkg.in/yaml.v3 v3.0.1/go.mod h1:K4uyk7z7BCEPqu6E+C64Yfv1cQ7kz7rIZviUmN+EgEM=
|
||||
gorm.io/driver/postgres v1.6.0 h1:2dxzU8xJ+ivvqTRph34QX+WrRaJlmfyPqXmoGVjMBa4=
|
||||
gorm.io/driver/postgres v1.6.0/go.mod h1:vUw0mrGgrTK+uPHEhAdV4sfFELrByKVGnaVRkXDhtWo=
|
||||
gorm.io/gorm v1.30.0 h1:qbT5aPv1UH8gI99OsRlvDToLxW5zR7FzS9acZDOZcgs=
|
||||
gorm.io/gorm v1.30.0/go.mod h1:8Z33v652h4//uMA76KjeDH8mJXPm1QNCYrMeatR0DOE=
|
||||
nullprogram.com/x/optparse v1.0.0/go.mod h1:KdyPE+Igbe0jQUrVfMqDMeJQIJZEuyV7pjYmp6pbG50=
|
||||
3
oldqc/Backend/go.sum:Zone.Identifier
Normal file
@ -0,0 +1,3 @@
[ZoneTransfer]
ZoneId=3
ReferrerUrl=C:\Users\AlexanderHall\Downloads\Auto_LLD-QC-main.zip
357
oldqc/Backend/main.go
Normal file
@ -0,0 +1,357 @@
package main

import (
	"encoding/json"
	"fmt"
	"log"
	"net/http"
	"os"
	"verofy-backend/models"
	"verofy-backend/qc"

	"github.com/gin-contrib/cors"
	"github.com/gin-gonic/gin"
	"github.com/joho/godotenv"
	"gorm.io/driver/postgres"
	"gorm.io/gorm"
)

var db *gorm.DB

func getEnv(key, fallback string) string {
	if value := os.Getenv(key); value != "" {
		return value
	}
	return fallback
}

func initDB() {
	if err := godotenv.Load(); err != nil {
		log.Println("No .env file found")
	}

	dsn := fmt.Sprintf(
		"host=%s user=%s password=%s dbname=%s port=%s sslmode=require",
		getEnv("DB_HOST", "localhost"),
		getEnv("DB_USER", "postgres"),
		getEnv("DB_PASS", ""),
		getEnv("DB_NAME", "verofy"),
		getEnv("DB_PORT", "5432"),
	)

	var err error
	db, err = gorm.Open(postgres.Open(dsn), &gorm.Config{})
	if err != nil {
		log.Fatal("Failed to connect to database:", err)
	}
}

// parseGeometry decodes a raw GeoJSON geometry, returning nil when the
// payload is empty or malformed.
func parseGeometry(rawGeometry json.RawMessage) interface{} {
	if len(rawGeometry) == 0 {
		return nil
	}

	var geometry interface{}
	if err := json.Unmarshal(rawGeometry, &geometry); err != nil {
		log.Printf("Failed to parse geometry: %v", err)
		return nil
	}
	return geometry
}

func main() {
	initDB()

	// Define configuration variables
	schema := getEnv("SCHEMA_NAME", "eli_test")
	segmentTable := getEnv("SEGMENT_TABLE", "segment2")
	zoneCol := getEnv("ZONE_COLUMN", "group_1")
	mapIDCol := getEnv("MAPID_COLUMN", "mapid")
	idCol := getEnv("ID_COLUMN", "id")
	qcFlagCol := getEnv("QCFLAG_COLUMN", "qc_flag")
	serverPort := getEnv("SERVER_PORT", "8080")

	router := gin.Default()
	router.Use(cors.Default())

	router.Static("/static", "../Frontend")

	router.GET("/", func(c *gin.Context) {
		c.File("../Frontend/index.html")
	})

	// Register QC routes
	qc.GraphConnectivityRoute(router, db, schema, segmentTable, mapIDCol, zoneCol, idCol, qcFlagCol)
	qc.SingleSpanRoute(router, db, schema, segmentTable, mapIDCol, zoneCol, idCol, qcFlagCol)
	qc.SiteConnectivityRoute(router, db, schema)
	qc.UndergroundEndpointsRoute(router, db, schema, segmentTable, mapIDCol, zoneCol, idCol, qcFlagCol)
	qc.AerialEndpointsRoute(router, db, schema, segmentTable, mapIDCol, zoneCol, idCol, qcFlagCol)
	qc.ZoneContainmentRoute(router, db, schema, segmentTable, mapIDCol, zoneCol, idCol, qcFlagCol)

	router.GET("/api/markets", func(c *gin.Context) {
		var markets []models.MarketOption
		table := fmt.Sprintf("%s.map_projects", schema)
		db.Table(table).Select("mapid, TRIM(project) as project").Where("mapid IS NOT NULL").Order("project").Scan(&markets)
		c.JSON(http.StatusOK, markets)
	})

	router.GET("/api/zones", func(c *gin.Context) {
		mapID := c.Query("map_id")
		if mapID == "" {
			c.JSON(http.StatusBadRequest, gin.H{"error": "Missing map_id query parameter"})
			return
		}

		var zones []string
		table := fmt.Sprintf("%s.%s", schema, segmentTable)
		db.Table(table).Where(fmt.Sprintf("%s = ? AND %s IS NOT NULL", mapIDCol, zoneCol), mapID).Distinct(zoneCol).Pluck(zoneCol, &zones)
		c.JSON(http.StatusOK, zones)
	})

	router.GET("/api/segments", func(c *gin.Context) {
		mapID := c.Query("map_id")
		zone := c.Query("zone")

		var segments []struct {
			ID0              int             `gorm:"column:id_0"`
			MapID            int             `gorm:"column:mapid"`
			SegmentType      string          `gorm:"column:segment_type"`
			SegmentStatus    string          `gorm:"column:segment_status"`
			ID               int             `gorm:"column:id"`
			ProtectionStatus string          `gorm:"column:protection_status"`
			QCFlag           string          `gorm:"column:qc_flag"`
			Group1           *string         `gorm:"column:group_1"`
			Geometry         json.RawMessage `gorm:"column:geometry"`
		}
		table := fmt.Sprintf("%s.%s", schema, segmentTable)
		query := db.Table(table).Select(fmt.Sprintf("id_0, %s, segment_type, segment_status, %s, protection_status, %s, \"%s\" as group_1, ST_AsGeoJSON(ST_Transform(geom, 4326))::json AS geometry", mapIDCol, idCol, qcFlagCol, zoneCol))

		if mapID != "" {
			query = query.Where(fmt.Sprintf("%s = ?", mapIDCol), mapID)
		}
		if zone != "" {
			query = query.Where(fmt.Sprintf("\"%s\" = ?", zoneCol), zone)
		}

		query.Find(&segments)

		features := []map[string]interface{}{}
		for _, s := range segments {
			features = append(features, map[string]interface{}{
				"type":     "Feature",
				"geometry": parseGeometry(s.Geometry),
				"properties": map[string]interface{}{
					"id_0":              s.ID0,
					"mapid":             s.MapID,
					"segment_type":      s.SegmentType,
					"segment_status":    s.SegmentStatus,
					"id":                s.ID,
					"protection_status": s.ProtectionStatus,
					"qc_flag":           s.QCFlag,
					"group_1":           s.Group1,
				},
			})
		}

		c.JSON(http.StatusOK, map[string]interface{}{
			"type":     "FeatureCollection",
			"features": features,
		})
	})

	// SITES
	router.GET("/api/sites", func(c *gin.Context) {
		mapID := c.Query("mapprojectid")

		var sites []struct {
			GID          int             `gorm:"column:gid"`
			ID           *int            `gorm:"column:id"`
			MapProjectID *int            `gorm:"column:mapprojectid"`
			Name         *string         `gorm:"column:name"`
			Address1     *string         `gorm:"column:address"`
			City         *string         `gorm:"column:city"`
			State        *string         `gorm:"column:state"`
			Zip          *string         `gorm:"column:zip"`
			Group1       *string         `gorm:"column:group1"`
			Geometry     json.RawMessage `gorm:"column:geometry"`
		}
		table := fmt.Sprintf("%s.sites", schema)
		query := db.Table(table).Select("gid, id, \"MapProjectID\" as mapprojectid, \"Name\" as name, \"Address1\" as address, \"City\" as city, \"State\" as state, \"Zip\" as zip, \"Group 1\" as group1, ST_AsGeoJSON(geometry)::json AS geometry")

		if mapID != "" {
			query = query.Where("\"MapProjectID\" = ?", mapID)
		}

		query.Find(&sites)

		features := []map[string]interface{}{}
		for _, s := range sites {
			features = append(features, map[string]interface{}{
				"type":     "Feature",
				"geometry": parseGeometry(s.Geometry),
				"properties": map[string]interface{}{
					"gid":          s.GID,
					"id":           s.ID,
					"mapprojectid": s.MapProjectID,
					"name":         s.Name,
					"address":      s.Address1,
					"city":         s.City,
					"state":        s.State,
					"zip":          s.Zip,
					"group1":       s.Group1,
				},
			})
		}

		c.JSON(http.StatusOK, map[string]interface{}{
			"type":     "FeatureCollection",
			"features": features,
		})
	})

	// POLES
	router.GET("/api/poles", func(c *gin.Context) {
		mapID := c.Query("map_id")

		var poles []models.PolesGeoJSON
		table := fmt.Sprintf("%s.poles", schema)
		query := db.Table(table).Select("gid, id, mapprojectid, name, tags, group1, group2, owner, poleheight, attachmentheight, ST_AsGeoJSON(ST_Transform(geom, 4326))::json AS geometry")

		if mapID != "" {
			query = query.Where("mapprojectid = ?", mapID)
		}

		query.Find(&poles)

		features := []map[string]interface{}{}
		for _, p := range poles {
			features = append(features, map[string]interface{}{
				"type":     "Feature",
				"geometry": parseGeometry(p.Geometry),
				"properties": map[string]interface{}{
					"gid":              p.GID,
					"id":               p.ID,
					"mapprojectid":     p.MapProjectID,
					"name":             p.Name,
					"tags":             p.Tags,
					"group1":           p.Group1,
					"group2":           p.Group2,
					"owner":            p.Owner,
					"poleheight":       p.PoleHeight,
					"attachmentheight": p.AttachmentHeight,
				},
			})
		}

		c.JSON(http.StatusOK, map[string]interface{}{
			"type":     "FeatureCollection",
			"features": features,
		})
	})

	// Access_Points
	router.GET("/api/access_points", func(c *gin.Context) {
		mapID := c.Query("map_id")

		var accessPoints []models.AccessPointGeoJSON
		table := fmt.Sprintf("%s.access_points", schema)
		query := db.Table(table).Select(`
			gid, id, name, mapprojectid, latitude, longitude, manufacturer, size, locked, description, aka,
			createdby, createddate, modifiedby, modifieddate, historyid, group1, group2, typeid, statusid,
			crmvendorid, billdate,
			ST_AsGeoJSON(ST_Transform(geom, 4326))::json AS geometry
		`)

		if mapID != "" {
			query = query.Where("mapprojectid = ?", mapID)
		}

		query.Find(&accessPoints)

		features := []map[string]interface{}{}
		for _, ap := range accessPoints {
			features = append(features, map[string]interface{}{
				"type":     "Feature",
				"geometry": parseGeometry(ap.Geometry),
				"properties": map[string]interface{}{
					"gid":          ap.GID,
					"id":           ap.ID,
					"name":         ap.Name,
					"mapprojectid": ap.MapProjectID,
					"latitude":     ap.Latitude,
					"longitude":    ap.Longitude,
					"manufacturer": ap.Manufacturer,
					"size":         ap.Size,
					"locked":       ap.Locked,
					"description":  ap.Description,
					"aka":          ap.AKA,
					"createdby":    ap.CreatedBy,
					"createddate":  ap.CreatedDate,
					"modifiedby":   ap.ModifiedBy,
					"modifieddate": ap.ModifiedDate,
					"historyid":    ap.HistoryID,
					"group1":       ap.Group1,
					"group2":       ap.Group2,
					"typeid":       ap.TypeID,
					"statusid":     ap.StatusID,
					"crmvendorid":  ap.CRMVendorID,
					"billdate":     ap.BillDate,
				},
			})
		}

		c.JSON(http.StatusOK, map[string]interface{}{
			"type":     "FeatureCollection",
			"features": features,
		})
	})

	// Info Objects - FIXED
	router.GET("/api/info", func(c *gin.Context) {
		mapID := c.Query("map_id")

		var infos []models.InfoGeoJSON
		table := fmt.Sprintf("%s.info", schema)
		query := db.Table(table).Select(`
			id, name, tags, description, group_1, group_2,
			ST_AsGeoJSON(geom)::json AS geometry
		`)

		if mapID != "" {
			query = query.Where("mapprojectid = ?", mapID)
		}

		if err := query.Find(&infos).Error; err != nil {
			c.JSON(http.StatusInternalServerError, gin.H{"error": err.Error()})
			return
		}

		features := []map[string]interface{}{}
		for _, info := range infos {
			features = append(features, map[string]interface{}{
				"type":     "Feature",
				"geometry": parseGeometry(info.Geometry),
				"properties": map[string]interface{}{
					"id":          info.ID,
					"name":        info.Name,
					"tags":        info.Tags,
					"description": info.Description,
					"group_1":     info.Group1,
					"group_2":     info.Group2,
				},
			})
		}

		c.JSON(http.StatusOK, map[string]interface{}{
			"type":     "FeatureCollection",
			"features": features,
		})
	})

	// Server Start
	log.Printf("Server is running on http://localhost:%s", serverPort)
	if err := router.Run(":" + serverPort); err != nil {
		log.Fatal("Server failed:", err)
	}
}
3
oldqc/Backend/main.go:Zone.Identifier
Normal file
@ -0,0 +1,3 @@
[ZoneTransfer]
ZoneId=3
ReferrerUrl=C:\Users\AlexanderHall\Downloads\Auto_LLD-QC-main.zip
18
oldqc/Backend/migrations/add_site_connectivity_fields.sql
Normal file
@ -0,0 +1,18 @@
-- Add connectivity fields to sites table for site connectivity QC
-- Run this script against your database to add the required columns

-- Add connectivity_status column (connected/disconnected)
ALTER TABLE eli_test.sites
ADD COLUMN IF NOT EXISTS connectivity_status VARCHAR(20) DEFAULT NULL;

-- Add connectivity_distance column (distance to nearest segment in meters)
ALTER TABLE eli_test.sites
ADD COLUMN IF NOT EXISTS connectivity_distance FLOAT DEFAULT NULL;

-- Create index for performance on connectivity queries
CREATE INDEX IF NOT EXISTS idx_sites_connectivity_status
ON eli_test.sites(connectivity_status);

-- Optional: Add comments to document the columns
COMMENT ON COLUMN eli_test.sites.connectivity_status IS 'Site connectivity status: connected/disconnected based on distance to network';
COMMENT ON COLUMN eli_test.sites.connectivity_distance IS 'Distance in meters to nearest network segment';
@ -0,0 +1,3 @@
[ZoneTransfer]
ZoneId=3
ReferrerUrl=C:\Users\AlexanderHall\Downloads\Auto_LLD-QC-main.zip
172
oldqc/Backend/models/models.go
Normal file
@ -0,0 +1,172 @@
package models

import "encoding/json"

type Segment struct {
	ID0              int             `gorm:"column:id_0;primaryKey" json:"id_0"`
	MapID            int             `gorm:"column:mapid" json:"mapid"`
	SegmentType      string          `gorm:"column:segment_type" json:"segment_type"`
	SegmentStatus    string          `gorm:"column:segment_status" json:"segment_status"`
	ID               int             `gorm:"column:id" json:"id"`
	ProtectionStatus string          `gorm:"column:protection_status" json:"protection_status"`
	QCFlag           string          `gorm:"column:qc_flag" json:"qc_flag"`
	Geometry         json.RawMessage `gorm:"column:geometry" json:"geometry"`
}

func (Segment) TableName() string {
	return "eli_test.segment2"
}

// SegmentGeoJSON is the GeoJSON response struct, with Geometry as raw JSON.
type SegmentGeoJSON struct {
	ID0              int             `gorm:"column:id_0" json:"id_0"`
	MapID            int             `gorm:"column:mapid" json:"mapid"`
	SegmentType      string          `gorm:"column:segment_type" json:"segment_type"`
	SegmentStatus    string          `gorm:"column:segment_status" json:"segment_status"`
	ID               int             `gorm:"column:id" json:"id"`
	ProtectionStatus string          `gorm:"column:protection_status" json:"protection_status"`
	QCFlag           string          `gorm:"column:qc_flag" json:"qc_flag"`
	Geometry         json.RawMessage `gorm:"column:geometry" json:"geometry"` // Added missing geometry field
}

// Sites struct (exported, with JSON tags and GORM column names)
type Sites struct {
	GID                  int      `json:"gid" gorm:"primaryKey;column:gid"`
	ID                   *int     `json:"id" gorm:"column:id"`
	MapProjectID         *int     `json:"mapprojectid" gorm:"column:mapprojectid"`
	Longitude            *string  `json:"longitude" gorm:"column:longitude"`
	Latitude             *string  `json:"latitude" gorm:"column:latitude"`
	Exclude              *int     `json:"exclude" gorm:"column:exclude"`
	Custom               *int     `json:"custom" gorm:"column:custom"`
	Color                *string  `json:"color" gorm:"column:color"`
	Opacity              *string  `json:"opacity" gorm:"column:opacity"`
	ShapeID              *string  `json:"shapeid" gorm:"column:shapeid"`
	StyleSize            *string  `json:"stylesize" gorm:"column:stylesize"`
	CreatedBy            *int     `json:"createdby" gorm:"column:createdby"`
	CreatedDate          *int     `json:"createddate" gorm:"column:createddate"`
	ModifiedBy           *int     `json:"modifiedby" gorm:"column:modifiedby"`
	ModifiedDate         *int     `json:"modifieddate" gorm:"column:modifieddate"`
	HistoryID            *int     `json:"historyid" gorm:"column:historyid"`
	Name                 *string  `json:"name" gorm:"column:name"`
	StatusID             *int     `json:"statusid" gorm:"column:statusid"`
	Group1               *string  `json:"group1" gorm:"column:group1"`
	Group2               *string  `json:"group2" gorm:"column:group2"`
	IconTypeID           *int     `json:"icontypeid" gorm:"column:icontypeid"`
	SchoolID             *string  `json:"schoolid" gorm:"column:schoolid"`
	SiteDemarc           *string  `json:"sitedemarc" gorm:"column:sitedemarc"`
	Address1             *string  `json:"address1" gorm:"column:address1"`
	Address2             *string  `json:"address2" gorm:"column:address2"`
	City                 *string  `json:"city" gorm:"column:city"`
	State                *string  `json:"state" gorm:"column:state"`
	Zip                  *string  `json:"zip" gorm:"column:zip"`
	ConnectivityStatus   *string  `json:"connectivity_status" gorm:"column:connectivity_status"`
	ConnectivityDistance *float64 `json:"connectivity_distance" gorm:"column:connectivity_distance"`
}

// SitesGeoJSON struct (for the GeoJSON API response)
type SitesGeoJSON struct {
	GID          int             `json:"gid" gorm:"column:gid"`
	ID           *int            `json:"id" gorm:"column:id"`
	MapProjectID *int            `json:"mapprojectid" gorm:"column:mapprojectid"`
	Name         *string         `json:"name" gorm:"column:name"`
	Address1     *string         `json:"address1" gorm:"column:address1"`
	City         *string         `json:"city" gorm:"column:city"`
	State        *string         `json:"state" gorm:"column:state"`
	Zip          *string         `json:"zip" gorm:"column:zip"`
	Geometry     json.RawMessage `json:"geometry" gorm:"column:geometry"`
}

// Poles struct (exported, full DB mapping)
type Poles struct {
	GID              int     `json:"gid" gorm:"primaryKey;column:gid"`
	ID               *int    `json:"id" gorm:"column:id"`
	MapProjectID     *int    `json:"mapprojectid" gorm:"column:mapprojectid"`
	Latitude         *string `json:"latitude" gorm:"column:latitude"`
	Longitude        *string `json:"longitude" gorm:"column:longitude"`
	Custom           *int    `json:"custom" gorm:"column:custom"`
	Color            *string `json:"color" gorm:"column:color"`
	ShapeID          *string `json:"shapeid" gorm:"column:shapeid"`
	StyleSize        *string `json:"stylesize" gorm:"column:stylesize"`
	Opacity          *string `json:"opacity" gorm:"column:opacity"`
	CreatedBy        *int    `json:"createdby" gorm:"column:createdby"`
	CreatedDate      *int    `json:"createddate" gorm:"column:createddate"`
	ModifiedBy       *int    `json:"modifiedby" gorm:"column:modifiedby"`
	ModifiedDate     *int    `json:"modifieddate" gorm:"column:modifieddate"`
	HistoryID        *int    `json:"historyid" gorm:"column:historyid"`
	Name             *string `json:"name" gorm:"column:name"`
	Tags             *string `json:"tags" gorm:"column:tags"`
	Group1           *string `json:"group1" gorm:"column:group1"`
	Group2           *string `json:"group2" gorm:"column:group2"`
	MRStateID        *int    `json:"mrstateid" gorm:"column:mrstateid"`
	CommsMRChoiceID  *int    `json:"commsmrchoiceid" gorm:"column:commsmrchoiceid"`
	PowerMRChoiceID  *string `json:"powermrchoiceid" gorm:"column:powermrchoiceid"`
	PoleHeight       *string `json:"poleheight" gorm:"column:poleheight"`
	AttachmentHeight *string `json:"attachmentheight" gorm:"column:attachmentheight"`
	MRNotes          *string `json:"mrnotes" gorm:"column:mrnotes"`
	Owner            *string `json:"owner" gorm:"column:owner"`
	Geom             []byte  `json:"geom" gorm:"column:geom"`
}

// PolesGeoJSON struct (for the GeoJSON response)
type PolesGeoJSON struct {
	GID              int             `json:"gid" gorm:"column:gid"`
	ID               *int            `json:"id" gorm:"column:id"`
	MapProjectID     *int            `json:"mapprojectid" gorm:"column:mapprojectid"`
	Name             *string         `json:"name" gorm:"column:name"`
	Tags             *string         `json:"tags" gorm:"column:tags"`
	Group1           *string         `json:"group1" gorm:"column:group1"`
	Group2           *string         `json:"group2" gorm:"column:group2"`
	Owner            *string         `json:"owner" gorm:"column:owner"`
	PoleHeight       *string         `json:"poleheight" gorm:"column:poleheight"`
	AttachmentHeight *string         `json:"attachmentheight" gorm:"column:attachmentheight"`
	Geometry         json.RawMessage `json:"geometry" gorm:"column:geometry"`
}

type AccessPointGeoJSON struct {
	GID          int             `json:"gid" gorm:"column:gid"`
	ID           *int            `json:"id" gorm:"column:id"`
	Name         *string         `json:"name" gorm:"column:name"`
	MapProjectID *int            `json:"mapprojectid" gorm:"column:mapprojectid"`
	Latitude     *string         `json:"latitude" gorm:"column:latitude"`
	Longitude    *string         `json:"longitude" gorm:"column:longitude"`
	Manufacturer *string         `json:"manufacturer" gorm:"column:manufacturer"`
	Size         *string         `json:"size" gorm:"column:size"`
	Locked       *int            `json:"locked" gorm:"column:locked"`
	Description  *string         `json:"description" gorm:"column:description"`
	AKA          *string         `json:"aka" gorm:"column:aka"`
	CreatedBy    *int            `json:"createdby" gorm:"column:createdby"`
	CreatedDate  *int            `json:"createddate" gorm:"column:createddate"`
	ModifiedBy   *string         `json:"modifiedby" gorm:"column:modifiedby"`
	ModifiedDate *string         `json:"modifieddate" gorm:"column:modifieddate"`
	HistoryID    *int            `json:"historyid" gorm:"column:historyid"`
	Group1       *string         `json:"group1" gorm:"column:group1"`
	Group2       *string         `json:"group2" gorm:"column:group2"`
	TypeID       *int            `json:"typeid" gorm:"column:typeid"`
	StatusID     *int            `json:"statusid" gorm:"column:statusid"`
	CRMVendorID  *string         `json:"crmvendorid" gorm:"column:crmvendorid"`
	BillDate     *string         `json:"billdate" gorm:"column:billdate"`
	Geometry     json.RawMessage `json:"geometry" gorm:"column:geometry"` // Changed to json.RawMessage
}

func (AccessPointGeoJSON) TableName() string {
	return "verofy.access_points"
}

type InfoGeoJSON struct {
	ID          int             `json:"id" gorm:"primaryKey;column:id"`
	Name        *string         `json:"name" gorm:"column:name"`
	Tags        *string         `json:"tags" gorm:"column:tags"`
	Description *string         `json:"description" gorm:"column:description"`
	Group1      *string         `json:"group_1" gorm:"column:group_1"`
	Group2      *string         `json:"group_2" gorm:"column:group_2"`
	Geometry    json.RawMessage `json:"geometry" gorm:"column:geometry"` // Fixed column name
}

func (InfoGeoJSON) TableName() string {
	return "verofy.Info"
}

type MarketOption struct {
	MapID   int    `json:"mapid" gorm:"column:mapid"`
	Project string `json:"project" gorm:"column:project"`
}
3
oldqc/Backend/models/models.go:Zone.Identifier
Normal file
@ -0,0 +1,3 @@
[ZoneTransfer]
ZoneId=3
ReferrerUrl=C:\Users\AlexanderHall\Downloads\Auto_LLD-QC-main.zip
246
oldqc/Backend/qc/aerial_endpoints.go
Normal file
@ -0,0 +1,246 @@
package qc

import (
	"fmt"
	"net/http"
	"verofy-backend/models"

	"github.com/gin-gonic/gin"
	"gorm.io/gorm"
)

type AerialEndpointResult struct {
	SegmentID      int                    `json:"segment_id"`
	SegmentName    string                 `json:"segment_name"`
	Type           string                 `json:"type"`
	IsValid        bool                   `json:"is_valid"`
	ErrorMessage   string                 `json:"error_message,omitempty"`
	StartPoleCount int                    `json:"start_pole_count"`
	EndPoleCount   int                    `json:"end_pole_count"`
	StartPoleIDs   []int                  `json:"start_pole_ids,omitempty"`
	EndPoleIDs     []int                  `json:"end_pole_ids,omitempty"`
	Geometry       map[string]interface{} `json:"geometry,omitempty"`
}

type AerialEndpointSummary struct {
	TotalAerialSegments int                    `json:"total_aerial_segments"`
	ValidSegments       int                    `json:"valid_segments"`
	InvalidSegments     int                    `json:"invalid_segments"`
	PassRate            float64                `json:"pass_rate"`
	Results             []AerialEndpointResult `json:"results"`
}

func AerialEndpointsRoute(router *gin.Engine, db *gorm.DB, schema, segmentTable, mapIDCol, zoneCol, idCol, qcFlagCol string) {
	// Full aerial endpoints summary endpoint
	router.GET("/api/qc/aerial-endpoints", func(c *gin.Context) {
		mapID := c.Query("map_id")
		zone := c.Query("zone")
		if mapID == "" || zone == "" {
			c.JSON(http.StatusBadRequest, gin.H{"error": "map_id and zone are required"})
			return
		}

		summary, err := CheckAerialEndpoints(db, mapID, zone, schema, segmentTable, mapIDCol, zoneCol, idCol, qcFlagCol)
		if err != nil {
			c.JSON(http.StatusInternalServerError, gin.H{"error": err.Error()})
			return
		}

		c.JSON(http.StatusOK, summary)
	})

	// Invalid segments only endpoint
	router.GET("/api/qc/aerial-endpoints/invalid", func(c *gin.Context) {
		mapID := c.Query("map_id")
		zone := c.Query("zone")
		if mapID == "" || zone == "" {
			c.JSON(http.StatusBadRequest, gin.H{"error": "map_id and zone are required"})
			return
		}

		invalid, err := GetInvalidAerialEndpoints(db, mapID, zone, schema, segmentTable, mapIDCol, zoneCol, idCol, qcFlagCol)
		if err != nil {
			c.JSON(http.StatusInternalServerError, gin.H{"error": err.Error()})
			return
		}

		c.JSON(http.StatusOK, gin.H{
			"invalid_segments": invalid,
			"count":            len(invalid),
		})
	})

	// Update QC flags endpoint
	router.POST("/api/qc/aerial-endpoints/update-flags", func(c *gin.Context) {
		var request struct {
			SegmentIDs []int  `json:"segment_ids"`
			MapID      string `json:"map_id"`
			Zone       string `json:"zone"`
		}

		if err := c.ShouldBindJSON(&request); err != nil {
			c.JSON(http.StatusBadRequest, gin.H{"error": err.Error()})
			return
		}

		err := UpdateAerialEndpointFlags(db, request.SegmentIDs, schema, segmentTable, idCol, qcFlagCol)
		if err != nil {
			c.JSON(http.StatusInternalServerError, gin.H{"error": err.Error()})
			return
		}

		c.JSON(http.StatusOK, gin.H{"message": fmt.Sprintf("Updated QC flags for %d segments", len(request.SegmentIDs))})
	})
}

// CheckAerialEndpoints validates that aerial segments have exactly one pole at each endpoint
func CheckAerialEndpoints(db *gorm.DB, mapID, zone, schema, segmentTable, mapIDCol, zoneCol, idCol, qcFlagCol string) (*AerialEndpointSummary, error) {
	var segments []models.SegmentGeoJSON
	table := fmt.Sprintf("%s.%s", schema, segmentTable)

	// Query aerial segments
	err := db.Table(table).
		Select(fmt.Sprintf("id_0, %s, segment_type, segment_status, %s, protection_status, %s, ST_AsGeoJSON(ST_Transform(geom, 4326))::json AS geometry", mapIDCol, idCol, qcFlagCol)).
		Where(fmt.Sprintf("%s = ? AND %s = ? AND LOWER(segment_type) = ?", mapIDCol, zoneCol), mapID, zone, "aerial").
		Find(&segments).Error

	if err != nil {
		return nil, fmt.Errorf("failed to fetch aerial segments: %w", err)
	}

	// Get poles for the same map
	poles, err := getPoles(db, mapID, schema)
	if err != nil {
		return nil, fmt.Errorf("failed to fetch poles: %w", err)
	}

	summary := &AerialEndpointSummary{
		TotalAerialSegments: len(segments),
		Results:             make([]AerialEndpointResult, 0, len(segments)),
	}

	for _, segment := range segments {
		result := validateAerialEndpoints(segment, poles)
		summary.Results = append(summary.Results, result)

		if result.IsValid {
			summary.ValidSegments++
		} else {
			summary.InvalidSegments++
		}
	}

	// Calculate pass rate
	if summary.TotalAerialSegments > 0 {
		summary.PassRate = float64(summary.ValidSegments) / float64(summary.TotalAerialSegments) * 100
	}

	return summary, nil
}

// validateAerialEndpoints checks if an aerial segment has exactly one pole at each endpoint
func validateAerialEndpoints(segment models.SegmentGeoJSON, poles []models.PolesGeoJSON) AerialEndpointResult {
	result := AerialEndpointResult{
		SegmentID: int(segment.ID),
		Type:      segment.SegmentType,
		IsValid:   false,
	}

	// Parse the geometry to get start and end coordinates
	if len(segment.Geometry) == 0 {
		result.ErrorMessage = "Segment has no geometry data"
		return result
	}

	startCoord, endCoord, geometry, err := getSegmentEndpoints(segment.Geometry)
	if err != nil {
		result.ErrorMessage = fmt.Sprintf("Failed to parse geometry: %v", err)
		return result
	}

	result.Geometry = geometry

	// Check for poles near start and end coordinates
	// Using a buffer distance of ~10 meters (0.0001 degrees approximately)
	bufferDistance := 0.0001

	startPoles, startPoleIDs := getPolesNearCoordinate(startCoord, poles, bufferDistance)
	endPoles, endPoleIDs := getPolesNearCoordinate(endCoord, poles, bufferDistance)

	result.StartPoleCount = startPoles
	result.EndPoleCount = endPoles
	result.StartPoleIDs = startPoleIDs
	result.EndPoleIDs = endPoleIDs

	// Valid if exactly ONE pole at each endpoint
	if startPoles == 1 && endPoles == 1 {
		result.IsValid = true
	} else {
		errorParts := []string{}
		if startPoles == 0 {
			errorParts = append(errorParts, "no pole at start")
|
||||
} else if startPoles > 1 {
|
||||
errorParts = append(errorParts, fmt.Sprintf("%d poles at start (should be 1)", startPoles))
|
||||
}
|
||||
if endPoles == 0 {
|
||||
errorParts = append(errorParts, "no pole at end")
|
||||
} else if endPoles > 1 {
|
||||
errorParts = append(errorParts, fmt.Sprintf("%d poles at end (should be 1)", endPoles))
|
||||
}
|
||||
|
||||
if len(errorParts) > 0 {
|
||||
result.ErrorMessage = fmt.Sprintf("Aerial segment pole issues: %v", errorParts)
|
||||
}
|
||||
}
|
||||
|
||||
return result
|
||||
}
|
||||
|
||||
// getPolesNearCoordinate counts poles within buffer distance of coordinate and returns their IDs
|
||||
func getPolesNearCoordinate(coord [2]float64, poles []models.PolesGeoJSON, buffer float64) (int, []int) {
|
||||
count := 0
|
||||
poleIDs := []int{}
|
||||
|
||||
for _, pole := range poles {
|
||||
if poleCoord, err := getPointCoordinates(pole.Geometry); err == nil {
|
||||
if distance(coord, poleCoord) <= buffer {
|
||||
count++
|
||||
if pole.ID != nil {
|
||||
poleIDs = append(poleIDs, *pole.ID)
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
return count, poleIDs
|
||||
}
|
||||
|
||||
// GetInvalidAerialEndpoints returns only the segments that failed the check
|
||||
func GetInvalidAerialEndpoints(db *gorm.DB, mapID, zone, schema, segmentTable, mapIDCol, zoneCol, idCol, qcFlagCol string) ([]AerialEndpointResult, error) {
|
||||
summary, err := CheckAerialEndpoints(db, mapID, zone, schema, segmentTable, mapIDCol, zoneCol, idCol, qcFlagCol)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
|
||||
var invalid []AerialEndpointResult
|
||||
for _, result := range summary.Results {
|
||||
if !result.IsValid {
|
||||
invalid = append(invalid, result)
|
||||
}
|
||||
}
|
||||
|
||||
return invalid, nil
|
||||
}
|
||||
|
||||
// UpdateAerialEndpointFlags updates QC flags for invalid segments
|
||||
func UpdateAerialEndpointFlags(db *gorm.DB, segmentIDs []int, schema, segmentTable, idCol, qcFlagCol string) error {
|
||||
if len(segmentIDs) == 0 {
|
||||
return nil
|
||||
}
|
||||
|
||||
table := fmt.Sprintf("%s.%s", schema, segmentTable)
|
||||
|
||||
return db.Table(table).
|
||||
Where(fmt.Sprintf("%s IN ?", idCol), segmentIDs).
|
||||
Update(qcFlagCol, "aerial_endpoint_issue").Error
|
||||
}
|
||||
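The endpoint validation above matches poles to segment endpoints with a planar comparison in degrees (~0.0001 degrees standing in for ~10 m). A minimal standalone sketch of that proximity test — the function names here are hypothetical, not part of this package:

```go
package main

import (
	"fmt"
	"math"
)

// euclidean returns the planar distance between two [lon, lat] pairs, in degrees.
func euclidean(a, b [2]float64) float64 {
	dx := a[0] - b[0]
	dy := a[1] - b[1]
	return math.Sqrt(dx*dx + dy*dy)
}

// countNear counts points within buffer degrees of coord, mirroring
// the per-endpoint pole count used by the QC check.
func countNear(coord [2]float64, points [][2]float64, buffer float64) int {
	n := 0
	for _, p := range points {
		if euclidean(coord, p) <= buffer {
			n++
		}
	}
	return n
}

func main() {
	endpoint := [2]float64{-97.7431, 30.2672}
	poles := [][2]float64{
		{-97.74305, 30.26722}, // ~0.00005 degrees away: inside the buffer
		{-97.7500, 30.2700},   // well outside the buffer
	}
	fmt.Println(countNear(endpoint, poles, 0.0001)) // prints 1
}
```

A segment passes only when this count is exactly 1 at both endpoints; 0 means a dangling span, 2+ means overlapping or duplicated poles.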
3
oldqc/Backend/qc/aerial_endpoints.go:Zone.Identifier
Normal file
@ -0,0 +1,3 @@
[ZoneTransfer]
ZoneId=3
ReferrerUrl=C:\Users\AlexanderHall\Downloads\Auto_LLD-QC-main.zip
53
oldqc/Backend/qc/graph_connect.go
Normal file
@ -0,0 +1,53 @@
package qc

import (
	"encoding/json"
	"fmt"
	"net/http"
	"verofy-backend/models"

	"github.com/gin-gonic/gin"
	"gorm.io/gorm"
)

func GraphConnectivityRoute(router *gin.Engine, db *gorm.DB, schema, segmentTable, mapIDCol, zoneCol, idCol, qcFlagCol string) {
	router.GET("/api/qc/connectivity", func(c *gin.Context) {
		mapID := c.Query("map_id")
		zone := c.Query("zone")
		if mapID == "" || zone == "" {
			c.JSON(http.StatusBadRequest, gin.H{"error": "map_id and zone are required"})
			return
		}

		var segments []models.SegmentGeoJSON
		table := fmt.Sprintf("%s.%s", schema, segmentTable)

		db.Table(table).
			Select(fmt.Sprintf("id_0, %s, segment_type, segment_status, %s, protection_status, %s, ST_AsGeoJSON(ST_Transform(geom, 4326))::json AS geometry", mapIDCol, idCol, qcFlagCol)).
			Where(fmt.Sprintf("%s = ? AND %s = ?", mapIDCol, zoneCol), mapID, zone).
			Find(&segments)

		features := []map[string]interface{}{}
		for _, s := range segments {
			var geometry interface{}
			if err := json.Unmarshal(s.Geometry, &geometry); err == nil {
				features = append(features, map[string]interface{}{
					"type":     "Feature",
					"geometry": geometry,
					"properties": map[string]interface{}{
						"id_0":              s.ID0,
						"mapid":             s.MapID,
						"segment_type":      s.SegmentType,
						"segment_status":    s.SegmentStatus,
						"id":                s.ID,
						"protection_status": s.ProtectionStatus,
					},
				})
			}
		}

		c.JSON(http.StatusOK, map[string]interface{}{
			"type":     "FeatureCollection",
			"features": features,
		})
	})
}
3
oldqc/Backend/qc/graph_connect.go:Zone.Identifier
Normal file
@ -0,0 +1,3 @@
[ZoneTransfer]
ZoneId=3
ReferrerUrl=C:\Users\AlexanderHall\Downloads\Auto_LLD-QC-main.zip
1
oldqc/Backend/qc/handholes.go
Normal file
@ -0,0 +1 @@
package qc
3
oldqc/Backend/qc/handholes.go:Zone.Identifier
Normal file
@ -0,0 +1,3 @@
[ZoneTransfer]
ZoneId=3
ReferrerUrl=C:\Users\AlexanderHall\Downloads\Auto_LLD-QC-main.zip
202
oldqc/Backend/qc/segment_single_span.go
Normal file
@ -0,0 +1,202 @@
package qc

import (
	"encoding/json"
	"fmt"
	"net/http"
	"verofy-backend/models"

	"github.com/gin-gonic/gin"
	"gorm.io/gorm"
)

type SingleSpanResult struct {
	SegmentID    int                    `json:"segment_id"`
	SegmentName  string                 `json:"segment_name"`
	Type         string                 `json:"type"`
	VertexCount  int                    `json:"vertex_count"`
	IsValid      bool                   `json:"is_valid"`
	ErrorMessage string                 `json:"error_message,omitempty"`
	Geometry     map[string]interface{} `json:"geometry,omitempty"`
}

type SingleSpanSummary struct {
	TotalAerialSegments int                `json:"total_aerial_segments"`
	ValidSegments       int                `json:"valid_segments"`
	InvalidSegments     int                `json:"invalid_segments"`
	PassRate            float64            `json:"pass_rate"`
	Results             []SingleSpanResult `json:"results"`
}

func SingleSpanRoute(router *gin.Engine, db *gorm.DB, schema, segmentTable, mapIDCol, zoneCol, idCol, qcFlagCol string) {
	// Full single-span summary endpoint
	router.GET("/api/qc/single-span", func(c *gin.Context) {
		mapID := c.Query("map_id")
		zone := c.Query("zone")
		if mapID == "" || zone == "" {
			c.JSON(http.StatusBadRequest, gin.H{"error": "map_id and zone are required"})
			return
		}

		summary, err := CheckSingleSpan(db, mapID, zone, schema, segmentTable, mapIDCol, zoneCol, idCol, qcFlagCol)
		if err != nil {
			c.JSON(http.StatusInternalServerError, gin.H{"error": err.Error()})
			return
		}

		c.JSON(http.StatusOK, summary)
	})

	// Invalid segments only endpoint
	router.GET("/api/qc/single-span/invalid", func(c *gin.Context) {
		mapID := c.Query("map_id")
		zone := c.Query("zone")
		if mapID == "" || zone == "" {
			c.JSON(http.StatusBadRequest, gin.H{"error": "map_id and zone are required"})
			return
		}

		invalid, err := GetInvalidSingleSpanSegments(db, mapID, zone, schema, segmentTable, mapIDCol, zoneCol, idCol, qcFlagCol)
		if err != nil {
			c.JSON(http.StatusInternalServerError, gin.H{"error": err.Error()})
			return
		}

		c.JSON(http.StatusOK, gin.H{
			"invalid_segments": invalid,
			"count":            len(invalid),
		})
	})
}

// CheckSingleSpan validates that aerial segments have exactly 2 vertices
func CheckSingleSpan(db *gorm.DB, mapID, zone, schema, segmentTable, mapIDCol, zoneCol, idCol, qcFlagCol string) (*SingleSpanSummary, error) {
	var segments []models.SegmentGeoJSON
	table := fmt.Sprintf("%s.%s", schema, segmentTable)

	// Query aerial segments using the same pattern as graph_connect.go
	err := db.Table(table).
		Select(fmt.Sprintf("id_0, %s, segment_type, segment_status, %s, protection_status, %s, ST_AsGeoJSON(ST_Transform(geom, 4326))::json AS geometry", mapIDCol, idCol, qcFlagCol)).
		Where(fmt.Sprintf("%s = ? AND %s = ? AND LOWER(segment_type) = ?", mapIDCol, zoneCol), mapID, zone, "aerial").
		Find(&segments).Error

	if err != nil {
		return nil, fmt.Errorf("failed to fetch aerial segments: %w", err)
	}

	summary := &SingleSpanSummary{
		TotalAerialSegments: len(segments),
		Results:             make([]SingleSpanResult, 0, len(segments)),
	}

	for _, segment := range segments {
		result := validateSegmentSpan(segment)
		summary.Results = append(summary.Results, result)

		if result.IsValid {
			summary.ValidSegments++
		} else {
			summary.InvalidSegments++
		}
	}

	// Calculate pass rate
	if summary.TotalAerialSegments > 0 {
		summary.PassRate = float64(summary.ValidSegments) / float64(summary.TotalAerialSegments) * 100
	}

	return summary, nil
}

// validateSegmentSpan checks if a segment has exactly 2 vertices
func validateSegmentSpan(segment models.SegmentGeoJSON) SingleSpanResult {
	result := SingleSpanResult{
		SegmentID: int(segment.ID),
		Type:      segment.SegmentType,
		IsValid:   false,
	}

	// Parse the geometry to count vertices
	if len(segment.Geometry) > 0 {
		vertexCount, geometry, err := countVerticesFromRawMessage(segment.Geometry)
		if err != nil {
			result.ErrorMessage = fmt.Sprintf("Failed to parse geometry: %v", err)
			return result
		}

		result.VertexCount = vertexCount
		result.Geometry = geometry

		// Aerial segments should have exactly 2 vertices (one span)
		if vertexCount == 2 {
			result.IsValid = true
		} else if vertexCount < 2 {
			result.ErrorMessage = fmt.Sprintf("Segment has only %d vertex(es), needs exactly 2 for a valid span", vertexCount)
		} else {
			result.ErrorMessage = fmt.Sprintf("Segment has %d vertices, should be exactly 2 for a single span (pole to pole)", vertexCount)
		}
	} else {
		result.ErrorMessage = "Segment has no geometry data"
	}

	return result
}

// countVerticesFromRawMessage parses json.RawMessage and counts vertices
func countVerticesFromRawMessage(geometryRaw json.RawMessage) (int, map[string]interface{}, error) {
	var geometry map[string]interface{}

	if err := json.Unmarshal(geometryRaw, &geometry); err != nil {
		return 0, nil, fmt.Errorf("invalid GeoJSON: %w", err)
	}

	geometryType, ok := geometry["type"].(string)
	if !ok {
		return 0, geometry, fmt.Errorf("missing or invalid geometry type")
	}

	coordinates, ok := geometry["coordinates"]
	if !ok {
		return 0, geometry, fmt.Errorf("missing coordinates")
	}

	var vertexCount int

	switch geometryType {
	case "LineString":
		// LineString coordinates are [[x,y], [x,y], ...]
		if coordArray, ok := coordinates.([]interface{}); ok {
			vertexCount = len(coordArray)
		}
	case "MultiLineString":
		// MultiLineString coordinates are [[[x,y], [x,y]], [[x,y], [x,y]]]
		if coordArrays, ok := coordinates.([]interface{}); ok {
			for _, coordArray := range coordArrays {
				if lineCoords, ok := coordArray.([]interface{}); ok {
					vertexCount += len(lineCoords)
				}
			}
		}
	default:
		return 0, geometry, fmt.Errorf("unsupported geometry type: %s", geometryType)
	}

	return vertexCount, geometry, nil
}

// GetInvalidSingleSpanSegments returns only the segments that failed the check
func GetInvalidSingleSpanSegments(db *gorm.DB, mapID, zone, schema, segmentTable, mapIDCol, zoneCol, idCol, qcFlagCol string) ([]SingleSpanResult, error) {
	summary, err := CheckSingleSpan(db, mapID, zone, schema, segmentTable, mapIDCol, zoneCol, idCol, qcFlagCol)
	if err != nil {
		return nil, err
	}

	var invalid []SingleSpanResult
	for _, result := range summary.Results {
		if !result.IsValid {
			invalid = append(invalid, result)
		}
	}

	return invalid, nil
}
3
oldqc/Backend/qc/segment_single_span.go:Zone.Identifier
Normal file
@ -0,0 +1,3 @@
[ZoneTransfer]
ZoneId=3
ReferrerUrl=C:\Users\AlexanderHall\Downloads\Auto_LLD-QC-main.zip
125
oldqc/Backend/qc/site_connectivity.go
Normal file
@ -0,0 +1,125 @@
package qc

import (
	"fmt"
	"net/http"
	"strconv"
	"verofy-backend/models"

	"github.com/gin-gonic/gin"
	"gorm.io/gorm"
)

func SiteConnectivityRoute(router *gin.Engine, db *gorm.DB, schema string) {
	router.GET("/api/qc/site-connectivity", func(c *gin.Context) {
		mapID := c.Query("map_id")
		zone := c.Query("zone")
		maxDistanceStr := c.Query("max_distance")

		if mapID == "" {
			c.JSON(http.StatusBadRequest, gin.H{"error": "map_id is required"})
			return
		}

		// Default max distance is 50 meters
		maxDistance := 50.0
		if maxDistanceStr != "" {
			if dist, err := strconv.ParseFloat(maxDistanceStr, 64); err == nil {
				maxDistance = dist
			}
		}

		// Get all sites for the map (sites are not filtered by zone)
		var sites []models.SitesGeoJSON
		db.Table(fmt.Sprintf("%s.sites", schema)).
			Select("gid, id, mapprojectid, name, address1, city, state, zip, ST_AsGeoJSON(geom)::json AS geometry").
			Where("mapprojectid = ?", mapID).
			Find(&sites)

		// Get all segments for connectivity analysis
		var segments []models.SegmentGeoJSON
		segmentQuery := db.Table(fmt.Sprintf("%s.segment2", schema)).
			Select("id_0, mapid, segment_type, segment_status, id, protection_status, qc_flag, ST_AsGeoJSON(geom)::json AS geometry").
			Where("mapid = ?", mapID)
		if zone != "" {
			segmentQuery = segmentQuery.Where("group_1 = ?", zone)
		}
		segmentQuery.Find(&segments)

		// Analyze connectivity for each site
		results := []map[string]interface{}{}
		connectedCount := 0
		disconnectedCount := 0

		for _, site := range sites {
			if len(site.Geometry) == 0 {
				continue
			}

			// Use PostGIS to find the distance (in meters, via EPSG:3857) to the nearest segment
			var nearestDistance float64
			nearestQuery := fmt.Sprintf(`
				SELECT ST_Distance(
					ST_Transform(sites.geom, 3857),
					ST_Transform(segments.geom, 3857)
				) AS distance
				FROM %s.sites sites, %s.segment2 segments
				WHERE sites.gid = ? AND segments.mapid = ?
				ORDER BY ST_Distance(
					ST_Transform(sites.geom, 3857),
					ST_Transform(segments.geom, 3857)
				)
				LIMIT 1
			`, schema, schema)

			db.Raw(nearestQuery, site.GID, mapID).Scan(&nearestDistance)

			isConnected := nearestDistance <= maxDistance
			status := "connected"
			if isConnected {
				connectedCount++
			} else {
				status = "disconnected"
				disconnectedCount++
			}

			// Persist the site's connectivity status
			updateQuery := fmt.Sprintf("UPDATE %s.sites SET connectivity_status = ?, connectivity_distance = ? WHERE gid = ?", schema)
			db.Exec(updateQuery, status, nearestDistance, site.GID)

			results = append(results, map[string]interface{}{
				"site_id":             site.GID,
				"site_name":           site.Name,
				"mapprojectid":        site.MapProjectID,
				"is_connected":        isConnected,
				"nearest_distance":    nearestDistance,
				"connectivity_status": status,
				"geometry":            site.Geometry,
				"address":             site.Address1,
				"city":                site.City,
				"state":               site.State,
			})
		}

		// Guard against division by zero when the map has no sites
		connectivityRate := 0.0
		if len(sites) > 0 {
			connectivityRate = float64(connectedCount) / float64(len(sites)) * 100
		}

		c.JSON(http.StatusOK, map[string]interface{}{
			"total_sites":         len(sites),
			"connected_sites":     connectedCount,
			"disconnected_sites":  disconnectedCount,
			"connectivity_rate":   connectivityRate,
			"max_distance_meters": maxDistance,
			"results":             results,
		})
	})
}
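The site-connectivity check boils down to a threshold split on each site's nearest-segment distance. A standalone sketch of that classification, assuming the PostGIS distances have already been computed (names hypothetical):

```go
package main

import "fmt"

// classify splits sites into connected/disconnected by their nearest-segment
// distance in meters, the same rule the route applies per site.
func classify(nearest map[string]float64, maxDistance float64) (connected, disconnected []string) {
	for site, d := range nearest {
		if d <= maxDistance {
			connected = append(connected, site)
		} else {
			disconnected = append(disconnected, site)
		}
	}
	return
}

func main() {
	// Hypothetical nearest-segment distances, in meters.
	nearest := map[string]float64{"site-a": 12.5, "site-b": 140.0}
	conn, disc := classify(nearest, 50.0)
	fmt.Println(len(conn), len(disc)) // prints: 1 1
}
```

Keeping the threshold a query parameter (`max_distance`, default 50 m) lets QC reviewers tighten or relax the rule per market without a redeploy.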
3
oldqc/Backend/qc/site_connectivity.go:Zone.Identifier
Normal file
@ -0,0 +1,3 @@
[ZoneTransfer]
ZoneId=3
ReferrerUrl=C:\Users\AlexanderHall\Downloads\Auto_LLD-QC-main.zip
412
oldqc/Backend/qc/underground_endpoints.go
Normal file
@ -0,0 +1,412 @@
package qc

import (
	"encoding/json"
	"fmt"
	"math"
	"net/http"
	"verofy-backend/models"

	"github.com/gin-gonic/gin"
	"gorm.io/gorm"
)

type UndergroundEndpointResult struct {
	SegmentID     int                    `json:"segment_id"`
	SegmentName   string                 `json:"segment_name"`
	Type          string                 `json:"type"`
	IsValid       bool                   `json:"is_valid"`
	ErrorMessage  string                 `json:"error_message,omitempty"`
	StartEndpoint string                 `json:"start_endpoint,omitempty"`
	EndEndpoint   string                 `json:"end_endpoint,omitempty"`
	Geometry      map[string]interface{} `json:"geometry,omitempty"`
}

type UndergroundEndpointSummary struct {
	TotalUndergroundSegments int                         `json:"total_underground_segments"`
	ValidSegments            int                         `json:"valid_segments"`
	InvalidSegments          int                         `json:"invalid_segments"`
	PassRate                 float64                     `json:"pass_rate"`
	Results                  []UndergroundEndpointResult `json:"results"`
}

func UndergroundEndpointsRoute(router *gin.Engine, db *gorm.DB, schema, segmentTable, mapIDCol, zoneCol, idCol, qcFlagCol string) {
	// Full underground endpoints summary endpoint
	router.GET("/api/qc/underground-endpoints", func(c *gin.Context) {
		mapID := c.Query("map_id")
		zone := c.Query("zone")
		if mapID == "" || zone == "" {
			c.JSON(http.StatusBadRequest, gin.H{"error": "map_id and zone are required"})
			return
		}

		summary, err := CheckUndergroundEndpoints(db, mapID, zone, schema, segmentTable, mapIDCol, zoneCol, idCol, qcFlagCol)
		if err != nil {
			c.JSON(http.StatusInternalServerError, gin.H{"error": err.Error()})
			return
		}

		c.JSON(http.StatusOK, summary)
	})

	// Invalid segments only endpoint
	router.GET("/api/qc/underground-endpoints/invalid", func(c *gin.Context) {
		mapID := c.Query("map_id")
		zone := c.Query("zone")
		if mapID == "" || zone == "" {
			c.JSON(http.StatusBadRequest, gin.H{"error": "map_id and zone are required"})
			return
		}

		invalid, err := GetInvalidUndergroundEndpoints(db, mapID, zone, schema, segmentTable, mapIDCol, zoneCol, idCol, qcFlagCol)
		if err != nil {
			c.JSON(http.StatusInternalServerError, gin.H{"error": err.Error()})
			return
		}

		c.JSON(http.StatusOK, gin.H{
			"invalid_segments": invalid,
			"count":            len(invalid),
		})
	})

	// Update QC flags endpoint
	router.POST("/api/qc/underground-endpoints/update-flags", func(c *gin.Context) {
		var request struct {
			SegmentIDs []int  `json:"segment_ids"`
			MapID      string `json:"map_id"`
			Zone       string `json:"zone"`
		}

		if err := c.ShouldBindJSON(&request); err != nil {
			c.JSON(http.StatusBadRequest, gin.H{"error": err.Error()})
			return
		}

		err := UpdateUndergroundEndpointFlags(db, request.SegmentIDs, schema, segmentTable, idCol, qcFlagCol)
		if err != nil {
			c.JSON(http.StatusInternalServerError, gin.H{"error": err.Error()})
			return
		}

		c.JSON(http.StatusOK, gin.H{"message": fmt.Sprintf("Updated QC flags for %d segments", len(request.SegmentIDs))})
	})
}

// CheckUndergroundEndpoints validates that underground segments have poles or access points at both endpoints
func CheckUndergroundEndpoints(db *gorm.DB, mapID, zone, schema, segmentTable, mapIDCol, zoneCol, idCol, qcFlagCol string) (*UndergroundEndpointSummary, error) {
	var segments []models.SegmentGeoJSON
	table := fmt.Sprintf("%s.%s", schema, segmentTable)

	// Query underground segments
	err := db.Table(table).
		Select(fmt.Sprintf("id_0, %s, segment_type, segment_status, %s, protection_status, %s, ST_AsGeoJSON(geom)::json AS geometry", mapIDCol, idCol, qcFlagCol)).
		Where(fmt.Sprintf("%s = ? AND %s = ? AND LOWER(segment_type) = ?", mapIDCol, zoneCol), mapID, zone, "underground").
		Find(&segments).Error

	if err != nil {
		return nil, fmt.Errorf("failed to fetch underground segments: %w", err)
	}

	// Get poles and access points for the same map/zone
	poles, err := getPoles(db, mapID, schema)
	if err != nil {
		return nil, fmt.Errorf("failed to fetch poles: %w", err)
	}

	accessPoints, err := getAccessPoints(db, mapID, schema)
	if err != nil {
		return nil, fmt.Errorf("failed to fetch access points: %w", err)
	}

	summary := &UndergroundEndpointSummary{
		TotalUndergroundSegments: len(segments),
		Results:                  make([]UndergroundEndpointResult, 0, len(segments)),
	}

	for _, segment := range segments {
		result := validateUndergroundEndpoints(segment, poles, accessPoints)
		summary.Results = append(summary.Results, result)

		if result.IsValid {
			summary.ValidSegments++
		} else {
			summary.InvalidSegments++
		}
	}

	// Calculate pass rate
	if summary.TotalUndergroundSegments > 0 {
		summary.PassRate = float64(summary.ValidSegments) / float64(summary.TotalUndergroundSegments) * 100
	}

	return summary, nil
}

// validateUndergroundEndpoints checks if an underground segment has poles/access points at both ends
func validateUndergroundEndpoints(segment models.SegmentGeoJSON, poles []models.PolesGeoJSON, accessPoints []models.AccessPointGeoJSON) UndergroundEndpointResult {
	result := UndergroundEndpointResult{
		SegmentID: int(segment.ID),
		Type:      segment.SegmentType,
		IsValid:   false,
	}

	// Parse the geometry to get start and end coordinates
	if len(segment.Geometry) == 0 {
		result.ErrorMessage = "Segment has no geometry data"
		return result
	}

	startCoord, endCoord, geometry, err := getSegmentEndpoints(segment.Geometry)
	if err != nil {
		result.ErrorMessage = fmt.Sprintf("Failed to parse geometry: %v", err)
		return result
	}

	result.Geometry = geometry

	// Check for poles/access points near start and end coordinates
	// Using a buffer distance of ~10 meters (approximately 0.0001 degrees)
	bufferDistance := 0.0001

	startHasEndpoint := hasEndpointNearCoordinate(startCoord, poles, accessPoints, bufferDistance)
	endHasEndpoint := hasEndpointNearCoordinate(endCoord, poles, accessPoints, bufferDistance)

	result.StartEndpoint = getEndpointTypeNearCoordinate(startCoord, poles, accessPoints, bufferDistance)
	result.EndEndpoint = getEndpointTypeNearCoordinate(endCoord, poles, accessPoints, bufferDistance)

	if startHasEndpoint && endHasEndpoint {
		result.IsValid = true
	} else {
		errorParts := []string{}
		if !startHasEndpoint {
			errorParts = append(errorParts, "no pole/access point at start")
		}
		if !endHasEndpoint {
			errorParts = append(errorParts, "no pole/access point at end")
		}
		result.ErrorMessage = fmt.Sprintf("Underground segment missing endpoints: %v", errorParts)
	}

	return result
}

// getSegmentEndpoints extracts start and end coordinates from segment geometry
func getSegmentEndpoints(geometryRaw json.RawMessage) ([2]float64, [2]float64, map[string]interface{}, error) {
	var geometry map[string]interface{}

	if err := json.Unmarshal(geometryRaw, &geometry); err != nil {
		return [2]float64{}, [2]float64{}, nil, fmt.Errorf("invalid GeoJSON: %w", err)
	}

	geometryType, ok := geometry["type"].(string)
	if !ok {
		return [2]float64{}, [2]float64{}, geometry, fmt.Errorf("missing or invalid geometry type")
	}

	coordinates, ok := geometry["coordinates"]
	if !ok {
		return [2]float64{}, [2]float64{}, geometry, fmt.Errorf("missing coordinates")
	}

	var startCoord, endCoord [2]float64

	switch geometryType {
	case "LineString":
		// LineString coordinates are [[x,y], [x,y], ...]
		if coordArray, ok := coordinates.([]interface{}); ok && len(coordArray) >= 2 {
			if startPoint, ok := coordArray[0].([]interface{}); ok && len(startPoint) >= 2 {
				if x, ok := startPoint[0].(float64); ok {
					startCoord[0] = x
				}
				if y, ok := startPoint[1].(float64); ok {
					startCoord[1] = y
				}
			}
			if endPoint, ok := coordArray[len(coordArray)-1].([]interface{}); ok && len(endPoint) >= 2 {
				if x, ok := endPoint[0].(float64); ok {
					endCoord[0] = x
				}
				if y, ok := endPoint[1].(float64); ok {
					endCoord[1] = y
				}
			}
		} else {
			return [2]float64{}, [2]float64{}, geometry, fmt.Errorf("invalid LineString coordinates")
		}
	case "MultiLineString":
		// For MultiLineString, use the first and last coordinates of the entire geometry
		if coordArrays, ok := coordinates.([]interface{}); ok && len(coordArrays) > 0 {
			// Get start from first LineString
			if firstLine, ok := coordArrays[0].([]interface{}); ok && len(firstLine) >= 2 {
				if startPoint, ok := firstLine[0].([]interface{}); ok && len(startPoint) >= 2 {
					if x, ok := startPoint[0].(float64); ok {
						startCoord[0] = x
					}
					if y, ok := startPoint[1].(float64); ok {
						startCoord[1] = y
					}
				}
			}
			// Get end from last LineString
			if lastLine, ok := coordArrays[len(coordArrays)-1].([]interface{}); ok && len(lastLine) >= 2 {
				if endPoint, ok := lastLine[len(lastLine)-1].([]interface{}); ok && len(endPoint) >= 2 {
					if x, ok := endPoint[0].(float64); ok {
						endCoord[0] = x
					}
					if y, ok := endPoint[1].(float64); ok {
						endCoord[1] = y
					}
				}
			}
		} else {
			return [2]float64{}, [2]float64{}, geometry, fmt.Errorf("invalid MultiLineString coordinates")
		}
	default:
		return [2]float64{}, [2]float64{}, geometry, fmt.Errorf("unsupported geometry type: %s", geometryType)
	}

	return startCoord, endCoord, geometry, nil
}

// hasEndpointNearCoordinate checks if there's a pole or access point within buffer distance of coordinate
func hasEndpointNearCoordinate(coord [2]float64, poles []models.PolesGeoJSON, accessPoints []models.AccessPointGeoJSON, buffer float64) bool {
	// Check poles
	for _, pole := range poles {
		if poleCoord, err := getPointCoordinates(pole.Geometry); err == nil {
			if distance(coord, poleCoord) <= buffer {
				return true
			}
		}
	}

	// Check access points
	for _, ap := range accessPoints {
		if apCoord, err := getPointCoordinates(ap.Geometry); err == nil {
			if distance(coord, apCoord) <= buffer {
				return true
			}
		}
	}

	return false
}

// getEndpointTypeNearCoordinate returns the type of endpoint near the coordinate
func getEndpointTypeNearCoordinate(coord [2]float64, poles []models.PolesGeoJSON, accessPoints []models.AccessPointGeoJSON, buffer float64) string {
	// Check poles first
	for _, pole := range poles {
		if poleCoord, err := getPointCoordinates(pole.Geometry); err == nil {
			if distance(coord, poleCoord) <= buffer {
				if pole.ID != nil {
					return fmt.Sprintf("Pole (ID: %d)", *pole.ID)
				}
				return "Pole"
			}
		}
	}

	// Check access points
	for _, ap := range accessPoints {
		if apCoord, err := getPointCoordinates(ap.Geometry); err == nil {
			if distance(coord, apCoord) <= buffer {
				if ap.ID != nil {
					return fmt.Sprintf("Access Point (ID: %d)", *ap.ID)
				}
				return "Access Point"
			}
		}
	}

	return "None"
}

// getPointCoordinates extracts coordinates from point geometry
func getPointCoordinates(geometryRaw json.RawMessage) ([2]float64, error) {
	var geometry map[string]interface{}

	if err := json.Unmarshal(geometryRaw, &geometry); err != nil {
		return [2]float64{}, fmt.Errorf("invalid GeoJSON: %w", err)
	}

	geometryType, ok := geometry["type"].(string)
	if !ok || geometryType != "Point" {
		return [2]float64{}, fmt.Errorf("not a Point geometry")
	}

	coordinates, ok := geometry["coordinates"].([]interface{})
	if !ok || len(coordinates) < 2 {
		return [2]float64{}, fmt.Errorf("invalid Point coordinates")
	}

	var coord [2]float64
	if x, ok := coordinates[0].(float64); ok {
		coord[0] = x
	}
	if y, ok := coordinates[1].(float64); ok {
		coord[1] = y
	}

	return coord, nil
}

// distance calculates the Euclidean distance between two coordinates.
// Callers compare the result against a buffer expressed in degrees, so a
// true (not squared) distance is required here.
func distance(a, b [2]float64) float64 {
	dx := a[0] - b[0]
	dy := a[1] - b[1]
	return math.Sqrt(dx*dx + dy*dy)
}

// getPoles fetches poles for the given map ID
func getPoles(db *gorm.DB, mapID, schema string) ([]models.PolesGeoJSON, error) {
	var poles []models.PolesGeoJSON
	table := fmt.Sprintf("%s.poles", schema)

	err := db.Table(table).
		Select("gid, id, mapprojectid, name, tags, group1, group2, owner, poleheight, attachmentheight, ST_AsGeoJSON(geom)::json AS geometry").
		Where("mapprojectid = ?", mapID).
		Find(&poles).Error

	return poles, err
}

// getAccessPoints fetches access points for the given map ID
func getAccessPoints(db *gorm.DB, mapID, schema string) ([]models.AccessPointGeoJSON, error) {
	var accessPoints []models.AccessPointGeoJSON
	table := fmt.Sprintf("%s.access_points", schema)

	err := db.Table(table).
		Select(`gid, id, name, mapprojectid, latitude, longitude, manufacturer, size, locked, description, aka,
			createdby, createddate, modifiedby, modifieddate, historyid, group1, group2, typeid, statusid,
			crmvendorid, billdate, ST_AsGeoJSON(geom)::json AS geometry`).
		Where("mapprojectid = ?", mapID).
		Find(&accessPoints).Error

	return accessPoints, err
|
||||
}
|
||||
|
||||
// GetInvalidUndergroundEndpoints returns only the segments that failed the check
|
||||
func GetInvalidUndergroundEndpoints(db *gorm.DB, mapID, zone, schema, segmentTable, mapIDCol, zoneCol, idCol, qcFlagCol string) ([]UndergroundEndpointResult, error) {
|
||||
summary, err := CheckUndergroundEndpoints(db, mapID, zone, schema, segmentTable, mapIDCol, zoneCol, idCol, qcFlagCol)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
|
||||
var invalid []UndergroundEndpointResult
|
||||
for _, result := range summary.Results {
|
||||
if !result.IsValid {
|
||||
invalid = append(invalid, result)
|
||||
}
|
||||
}
|
||||
|
||||
return invalid, nil
|
||||
}
|
||||
|
||||
// UpdateUndergroundEndpointFlags updates QC flags for invalid segments
|
||||
func UpdateUndergroundEndpointFlags(db *gorm.DB, segmentIDs []int, schema, segmentTable, idCol, qcFlagCol string) error {
|
||||
if len(segmentIDs) == 0 {
|
||||
return nil
|
||||
}
|
||||
|
||||
table := fmt.Sprintf("%s.%s", schema, segmentTable)
|
||||
|
||||
return db.Table(table).
|
||||
Where(fmt.Sprintf("%s IN ?", idCol), segmentIDs).
|
||||
Update(qcFlagCol, "underground_endpoint_issue").Error
|
||||
}
|
||||
@ -0,0 +1,3 @@
[ZoneTransfer]
ZoneId=3
ReferrerUrl=C:\Users\AlexanderHall\Downloads\Auto_LLD-QC-main.zip
743
oldqc/Backend/qc/zone_containment.go
Normal file
@ -0,0 +1,743 @@
package qc

import (
	"encoding/json"
	"fmt"
	"net/http"
	"strings"
	"verofy-backend/models"

	"github.com/gin-gonic/gin"
	"gorm.io/gorm"
)

type ZoneContainmentResult struct {
	ElementID    int                    `json:"element_id"`
	ElementType  string                 `json:"element_type"` // "segment", "site", "pole", "access_point"
	ElementName  string                 `json:"element_name,omitempty"`
	AssignedZone *string                `json:"assigned_zone"`
	ActualZones  []string               `json:"actual_zones,omitempty"`
	IsValid      bool                   `json:"is_valid"`
	ErrorMessage string                 `json:"error_message,omitempty"`
	Geometry     map[string]interface{} `json:"geometry,omitempty"`
}

type ZoneContainmentSummary struct {
	TotalElements   int                     `json:"total_elements"`
	ValidElements   int                     `json:"valid_elements"`
	InvalidElements int                     `json:"invalid_elements"`
	PassRate        float64                 `json:"pass_rate"`
	ByType          map[string]TypeSummary  `json:"by_type"`
	Results         []ZoneContainmentResult `json:"results"`
}

type TypeSummary struct {
	Total   int `json:"total"`
	Valid   int `json:"valid"`
	Invalid int `json:"invalid"`
}

func ZoneContainmentRoute(router *gin.Engine, db *gorm.DB, schema, segmentTable, mapIDCol, zoneCol, idCol, qcFlagCol string) {
	// Full zone containment summary endpoint
	router.GET("/api/qc/zone-containment", func(c *gin.Context) {
		mapID := c.Query("map_id")
		zone := c.Query("zone")
		if mapID == "" || zone == "" {
			c.JSON(http.StatusBadRequest, gin.H{"error": "map_id and zone are required"})
			return
		}

		summary, err := CheckZoneContainment(db, mapID, zone, schema, segmentTable, mapIDCol, zoneCol, idCol, qcFlagCol)
		if err != nil {
			c.JSON(http.StatusInternalServerError, gin.H{"error": err.Error()})
			return
		}

		c.JSON(http.StatusOK, summary)
	})

	// Invalid elements only endpoint
	router.GET("/api/qc/zone-containment/invalid", func(c *gin.Context) {
		mapID := c.Query("map_id")
		zone := c.Query("zone")
		if mapID == "" || zone == "" {
			c.JSON(http.StatusBadRequest, gin.H{"error": "map_id and zone are required"})
			return
		}

		invalid, err := GetInvalidZoneContainment(db, mapID, zone, schema, segmentTable, mapIDCol, zoneCol, idCol, qcFlagCol)
		if err != nil {
			c.JSON(http.StatusInternalServerError, gin.H{"error": err.Error()})
			return
		}

		c.JSON(http.StatusOK, gin.H{
			"invalid_elements": invalid,
			"count":            len(invalid),
		})
	})

	// Update QC flags endpoint (for segments)
	router.POST("/api/qc/zone-containment/update-flags", func(c *gin.Context) {
		var request struct {
			SegmentIDs []int  `json:"segment_ids"`
			MapID      string `json:"map_id"`
			Zone       string `json:"zone"`
		}

		if err := c.ShouldBindJSON(&request); err != nil {
			c.JSON(http.StatusBadRequest, gin.H{"error": err.Error()})
			return
		}

		err := UpdateZoneContainmentFlags(db, request.SegmentIDs, schema, segmentTable, idCol, qcFlagCol)
		if err != nil {
			c.JSON(http.StatusInternalServerError, gin.H{"error": err.Error()})
			return
		}

		c.JSON(http.StatusOK, gin.H{"message": fmt.Sprintf("Updated QC flags for %d segments", len(request.SegmentIDs))})
	})
}

// CheckZoneContainment validates that network elements are within their assigned zones
func CheckZoneContainment(db *gorm.DB, mapID, zone, schema, segmentTable, mapIDCol, zoneCol, idCol, qcFlagCol string) (*ZoneContainmentSummary, error) {
	summary := &ZoneContainmentSummary{
		Results: make([]ZoneContainmentResult, 0),
		ByType: map[string]TypeSummary{
			"segment":      {Total: 0, Valid: 0, Invalid: 0},
			"site":         {Total: 0, Valid: 0, Invalid: 0},
			"pole":         {Total: 0, Valid: 0, Invalid: 0},
			"access_point": {Total: 0, Valid: 0, Invalid: 0},
		},
	}

	// Get all zone polygons
	zones, err := getZonePolygons(db, schema)
	if err != nil {
		return nil, fmt.Errorf("failed to fetch zone polygons: %w", err)
	}

	// Check segments
	segmentResults, err := checkSegmentZones(db, mapID, zone, schema, segmentTable, mapIDCol, zoneCol, idCol, zones)
	if err != nil {
		return nil, fmt.Errorf("failed to check segments: %w", err)
	}
	summary.Results = append(summary.Results, segmentResults...)

	// Check sites
	siteResults, err := checkSiteZones(db, mapID, schema, zones)
	if err != nil {
		return nil, fmt.Errorf("failed to check sites: %w", err)
	}
	summary.Results = append(summary.Results, siteResults...)

	// Check poles
	poleResults, err := checkPoleZones(db, mapID, schema, zones)
	if err != nil {
		return nil, fmt.Errorf("failed to check poles: %w", err)
	}
	summary.Results = append(summary.Results, poleResults...)

	// Check access points
	accessPointResults, err := checkAccessPointZones(db, mapID, schema, zones)
	if err != nil {
		return nil, fmt.Errorf("failed to check access points: %w", err)
	}
	summary.Results = append(summary.Results, accessPointResults...)

	// Calculate summary statistics
	for _, result := range summary.Results {
		typeSummary := summary.ByType[result.ElementType]
		typeSummary.Total++
		if result.IsValid {
			typeSummary.Valid++
			summary.ValidElements++
		} else {
			typeSummary.Invalid++
			summary.InvalidElements++
		}
		summary.ByType[result.ElementType] = typeSummary
		summary.TotalElements++
	}

	// Calculate pass rate
	if summary.TotalElements > 0 {
		summary.PassRate = float64(summary.ValidElements) / float64(summary.TotalElements) * 100
	}

	return summary, nil
}

// getZonePolygons fetches all zone polygons from the info table
func getZonePolygons(db *gorm.DB, schema string) ([]models.InfoGeoJSON, error) {
	var zones []models.InfoGeoJSON
	table := fmt.Sprintf("%s.info", schema)

	err := db.Table(table).
		Select("id, name, group_1, ST_AsGeoJSON(geom)::json AS geometry").
		Find(&zones).Error

	if err != nil {
		return nil, err
	}

	return zones, nil
}

// checkSegmentZones validates segments against their assigned zones using PostGIS
func checkSegmentZones(db *gorm.DB, mapID, zone, schema, segmentTable, mapIDCol, zoneCol, idCol string, zones []models.InfoGeoJSON) ([]ZoneContainmentResult, error) {
	// Use PostGIS to check intersection directly in the database
	type SegmentZoneCheck struct {
		ID           int             `gorm:"column:id"`
		SegmentType  string          `gorm:"column:segment_type"`
		AssignedZone *string         `gorm:"column:assigned_zone"`
		ActualZones  string          `gorm:"column:actual_zones"`
		Geometry     json.RawMessage `gorm:"column:geometry"`
	}

	var results []SegmentZoneCheck
	table := fmt.Sprintf("%s.%s", schema, segmentTable)
	infoTable := fmt.Sprintf("%s.info", schema)

	query := fmt.Sprintf(`
		SELECT
			s.%s as id,
			s.segment_type,
			s."%s" as assigned_zone,
			STRING_AGG(i.group_1, ',') as actual_zones,
			ST_AsGeoJSON(ST_Transform(s.geom, 4326))::json AS geometry
		FROM %s s
		LEFT JOIN %s i ON ST_Intersects(ST_Transform(s.geom, 4326), i.geom)
		WHERE s.%s = ? AND s."%s" = ?
		GROUP BY s.%s, s.segment_type, s."%s", s.geom
	`, idCol, zoneCol, table, infoTable, mapIDCol, zoneCol, idCol, zoneCol)

	err := db.Raw(query, mapID, zone).Scan(&results).Error
	if err != nil {
		return nil, err
	}

	qcResults := make([]ZoneContainmentResult, 0, len(results))

	for _, seg := range results {
		result := ZoneContainmentResult{
			ElementID:    seg.ID,
			ElementType:  "segment",
			ElementName:  seg.SegmentType,
			AssignedZone: seg.AssignedZone,
			Geometry:     parseGeometryToMap(seg.Geometry),
		}

		// Parse actual zones
		if seg.ActualZones != "" {
			result.ActualZones = splitZones(seg.ActualZones)
		} else {
			result.ActualZones = []string{}
		}

		// Check validity
		if seg.AssignedZone == nil || *seg.AssignedZone == "" {
			result.IsValid = false
			result.ErrorMessage = "Element has no assigned zone (NULL or blank)"
		} else if len(result.ActualZones) == 0 {
			result.IsValid = false
			result.ErrorMessage = fmt.Sprintf("Element assigned to '%s' but not found in any zone", *seg.AssignedZone)
		} else {
			// Check if assigned zone is in actual zones
			result.IsValid = false
			for _, actualZone := range result.ActualZones {
				if actualZone == *seg.AssignedZone {
					result.IsValid = true
					break
				}
			}
			if !result.IsValid {
				result.ErrorMessage = fmt.Sprintf("Element assigned to '%s' but found in: %v", *seg.AssignedZone, result.ActualZones)
			}
		}

		qcResults = append(qcResults, result)
	}

	return qcResults, nil
}

// splitZones splits a comma-separated zone string into trimmed, non-empty parts
func splitZones(zones string) []string {
	if zones == "" {
		return []string{}
	}
	parts := []string{}
	for _, z := range strings.Split(zones, ",") {
		z = strings.TrimSpace(z)
		if z != "" {
			parts = append(parts, z)
		}
	}
	return parts
}

// parseGeometryToMap parses raw geometry JSON into a map (nil on failure)
func parseGeometryToMap(geomJSON json.RawMessage) map[string]interface{} {
	var geomMap map[string]interface{}
	if err := json.Unmarshal(geomJSON, &geomMap); err != nil {
		return nil
	}
	return geomMap
}

// checkSiteZones validates sites against their assigned zones using PostGIS
func checkSiteZones(db *gorm.DB, mapID, schema string, zones []models.InfoGeoJSON) ([]ZoneContainmentResult, error) {
	type SiteZoneCheck struct {
		ID           int             `gorm:"column:id"`
		Name         *string         `gorm:"column:name"`
		AssignedZone *string         `gorm:"column:assigned_zone"`
		ActualZones  string          `gorm:"column:actual_zones"`
		Geometry     json.RawMessage `gorm:"column:geometry"`
	}

	var results []SiteZoneCheck
	table := fmt.Sprintf("%s.sites", schema)
	infoTable := fmt.Sprintf("%s.info", schema)

	query := fmt.Sprintf(`
		SELECT
			COALESCE(s.id, s.gid) as id,
			s."Name" as name,
			s."Group 1" as assigned_zone,
			STRING_AGG(i.group_1, ',') as actual_zones,
			ST_AsGeoJSON(s.geometry)::json AS geometry
		FROM %s s
		LEFT JOIN %s i ON ST_Within(s.geometry, i.geom)
		WHERE s."MapProjectID" = ?
		GROUP BY s.gid, s.id, s."Name", s."Group 1", s.geometry
	`, table, infoTable)

	err := db.Raw(query, mapID).Scan(&results).Error
	if err != nil {
		return nil, err
	}

	qcResults := make([]ZoneContainmentResult, 0, len(results))

	for _, site := range results {
		result := ZoneContainmentResult{
			ElementID:    site.ID,
			ElementType:  "site",
			ElementName:  "",
			AssignedZone: site.AssignedZone,
			Geometry:     parseGeometryToMap(site.Geometry),
		}

		if site.Name != nil {
			result.ElementName = *site.Name
		}

		// Parse actual zones
		if site.ActualZones != "" {
			result.ActualZones = splitZones(site.ActualZones)
		} else {
			result.ActualZones = []string{}
		}

		// Check validity
		if site.AssignedZone == nil || *site.AssignedZone == "" {
			result.IsValid = false
			result.ErrorMessage = "Element has no assigned zone (NULL or blank)"
		} else if len(result.ActualZones) == 0 {
			result.IsValid = false
			result.ErrorMessage = fmt.Sprintf("Element assigned to '%s' but not found in any zone", *site.AssignedZone)
		} else {
			// Check if assigned zone is in actual zones
			result.IsValid = false
			for _, actualZone := range result.ActualZones {
				if actualZone == *site.AssignedZone {
					result.IsValid = true
					break
				}
			}
			if !result.IsValid {
				result.ErrorMessage = fmt.Sprintf("Element assigned to '%s' but found in: %v", *site.AssignedZone, result.ActualZones)
			}
		}

		qcResults = append(qcResults, result)
	}

	return qcResults, nil
}

// checkPoleZones validates poles against their assigned zones using PostGIS
func checkPoleZones(db *gorm.DB, mapID, schema string, zones []models.InfoGeoJSON) ([]ZoneContainmentResult, error) {
	type PoleZoneCheck struct {
		ID           int             `gorm:"column:id"`
		Name         *string         `gorm:"column:name"`
		AssignedZone *string         `gorm:"column:assigned_zone"`
		ActualZones  string          `gorm:"column:actual_zones"`
		Geometry     json.RawMessage `gorm:"column:geometry"`
	}

	var results []PoleZoneCheck
	table := fmt.Sprintf("%s.poles", schema)
	infoTable := fmt.Sprintf("%s.info", schema)

	query := fmt.Sprintf(`
		SELECT
			COALESCE(p.id, p.gid) as id,
			p.name,
			p.group1 as assigned_zone,
			STRING_AGG(i.group_1, ',') as actual_zones,
			ST_AsGeoJSON(ST_Transform(p.geom, 4326))::json AS geometry
		FROM %s p
		LEFT JOIN %s i ON ST_Within(ST_Transform(p.geom, 4326), i.geom)
		WHERE p.mapprojectid = ?
		GROUP BY p.gid, p.id, p.name, p.group1, p.geom
	`, table, infoTable)

	err := db.Raw(query, mapID).Scan(&results).Error
	if err != nil {
		return nil, err
	}

	qcResults := make([]ZoneContainmentResult, 0, len(results))

	for _, pole := range results {
		result := ZoneContainmentResult{
			ElementID:    pole.ID,
			ElementType:  "pole",
			ElementName:  "",
			AssignedZone: pole.AssignedZone,
			Geometry:     parseGeometryToMap(pole.Geometry),
		}

		if pole.Name != nil {
			result.ElementName = *pole.Name
		}

		// Parse actual zones
		if pole.ActualZones != "" {
			result.ActualZones = splitZones(pole.ActualZones)
		} else {
			result.ActualZones = []string{}
		}

		// Check validity
		if pole.AssignedZone == nil || *pole.AssignedZone == "" {
			result.IsValid = false
			result.ErrorMessage = "Element has no assigned zone (NULL or blank)"
		} else if len(result.ActualZones) == 0 {
			result.IsValid = false
			result.ErrorMessage = fmt.Sprintf("Element assigned to '%s' but not found in any zone", *pole.AssignedZone)
		} else {
			// Check if assigned zone is in actual zones
			result.IsValid = false
			for _, actualZone := range result.ActualZones {
				if actualZone == *pole.AssignedZone {
					result.IsValid = true
					break
				}
			}
			if !result.IsValid {
				result.ErrorMessage = fmt.Sprintf("Element assigned to '%s' but found in: %v", *pole.AssignedZone, result.ActualZones)
			}
		}

		qcResults = append(qcResults, result)
	}

	return qcResults, nil
}

// checkAccessPointZones validates access points against their assigned zones using PostGIS
func checkAccessPointZones(db *gorm.DB, mapID, schema string, zones []models.InfoGeoJSON) ([]ZoneContainmentResult, error) {
	type AccessPointZoneCheck struct {
		ID           int             `gorm:"column:id"`
		Name         *string         `gorm:"column:name"`
		AssignedZone *string         `gorm:"column:assigned_zone"`
		ActualZones  string          `gorm:"column:actual_zones"`
		Geometry     json.RawMessage `gorm:"column:geometry"`
	}

	var results []AccessPointZoneCheck
	table := fmt.Sprintf("%s.access_points", schema)
	infoTable := fmt.Sprintf("%s.info", schema)

	query := fmt.Sprintf(`
		SELECT
			COALESCE(ap.id, ap.gid) as id,
			ap.name,
			ap.group1 as assigned_zone,
			STRING_AGG(i.group_1, ',') as actual_zones,
			ST_AsGeoJSON(ST_Transform(ap.geom, 4326))::json AS geometry
		FROM %s ap
		LEFT JOIN %s i ON ST_Within(ST_Transform(ap.geom, 4326), i.geom)
		WHERE ap.mapprojectid = ?
		GROUP BY ap.gid, ap.id, ap.name, ap.group1, ap.geom
	`, table, infoTable)

	err := db.Raw(query, mapID).Scan(&results).Error
	if err != nil {
		return nil, err
	}

	qcResults := make([]ZoneContainmentResult, 0, len(results))

	for _, ap := range results {
		result := ZoneContainmentResult{
			ElementID:    ap.ID,
			ElementType:  "access_point",
			ElementName:  "",
			AssignedZone: ap.AssignedZone,
			Geometry:     parseGeometryToMap(ap.Geometry),
		}

		if ap.Name != nil {
			result.ElementName = *ap.Name
		}

		// Parse actual zones
		if ap.ActualZones != "" {
			result.ActualZones = splitZones(ap.ActualZones)
		} else {
			result.ActualZones = []string{}
		}

		// Check validity
		if ap.AssignedZone == nil || *ap.AssignedZone == "" {
			result.IsValid = false
			result.ErrorMessage = "Element has no assigned zone (NULL or blank)"
		} else if len(result.ActualZones) == 0 {
			result.IsValid = false
			result.ErrorMessage = fmt.Sprintf("Element assigned to '%s' but not found in any zone", *ap.AssignedZone)
		} else {
			// Check if assigned zone is in actual zones
			result.IsValid = false
			for _, actualZone := range result.ActualZones {
				if actualZone == *ap.AssignedZone {
					result.IsValid = true
					break
				}
			}
			if !result.IsValid {
				result.ErrorMessage = fmt.Sprintf("Element assigned to '%s' but found in: %v", *ap.AssignedZone, result.ActualZones)
			}
		}

		qcResults = append(qcResults, result)
	}

	return qcResults, nil
}

// validateElementZone checks if an element is within its assigned zone.
// For segments: isLineString=true, allows partial intersection.
// For points: isLineString=false, requires the point to be within the zone.
func validateElementZone(elementID int, elementType, elementName string, assignedZone *string, geometry json.RawMessage, zones []models.InfoGeoJSON, isLineString bool) ZoneContainmentResult {
	result := ZoneContainmentResult{
		ElementID:    elementID,
		ElementType:  elementType,
		ElementName:  elementName,
		AssignedZone: assignedZone,
		IsValid:      false,
		ActualZones:  []string{},
	}

	// Parse geometry
	var geomMap map[string]interface{}
	if err := json.Unmarshal(geometry, &geomMap); err != nil {
		result.ErrorMessage = "Failed to parse geometry"
		return result
	}
	result.Geometry = geomMap

	// Check if assigned zone is NULL or empty - this is INVALID
	if assignedZone == nil || *assignedZone == "" {
		result.ErrorMessage = "Element has no assigned zone (NULL or blank)"
		return result
	}

	// Find which zones contain this element
	for _, zone := range zones {
		if zone.Group1 == nil {
			continue
		}

		if isLineString {
			// For segments (LineStrings): check if ANY part intersects with the zone
			if geometryIntersectsZone(geomMap, zone.Geometry) {
				result.ActualZones = append(result.ActualZones, *zone.Group1)
			}
		} else {
			// For points: check if the point is within the zone
			if pointWithinZone(geomMap, zone.Geometry) {
				result.ActualZones = append(result.ActualZones, *zone.Group1)
			}
		}
	}

	// Validate: assigned zone must be in the list of actual zones
	for _, actualZone := range result.ActualZones {
		if actualZone == *assignedZone {
			result.IsValid = true
			return result
		}
	}

	// Element is not in its assigned zone
	if len(result.ActualZones) == 0 {
		result.ErrorMessage = fmt.Sprintf("Element assigned to '%s' but not found in any zone", *assignedZone)
	} else {
		result.ErrorMessage = fmt.Sprintf("Element assigned to '%s' but found in: %v", *assignedZone, result.ActualZones)
	}

	return result
}

// geometryIntersectsZone checks if a LineString geometry intersects with a zone polygon
func geometryIntersectsZone(lineGeom map[string]interface{}, zoneGeometry json.RawMessage) bool {
	var zonePoly map[string]interface{}
	if err := json.Unmarshal(zoneGeometry, &zonePoly); err != nil {
		return false
	}

	// Get line coordinates
	lineCoords, ok := lineGeom["coordinates"].([]interface{})
	if !ok || len(lineCoords) == 0 {
		return false
	}

	// Get polygon coordinates (first ring is the outer boundary)
	polyCoords, ok := zonePoly["coordinates"].([]interface{})
	if !ok || len(polyCoords) == 0 {
		return false
	}

	outerRing, ok := polyCoords[0].([]interface{})
	if !ok || len(outerRing) == 0 {
		return false
	}

	// Check if ANY point of the line is within the polygon
	for _, coordInterface := range lineCoords {
		coord, ok := coordInterface.([]interface{})
		if !ok || len(coord) < 2 {
			continue
		}

		lng, ok1 := coord[0].(float64)
		lat, ok2 := coord[1].(float64)
		if !ok1 || !ok2 {
			continue
		}

		if pointInPolygon(lng, lat, outerRing) {
			return true
		}
	}

	return false
}

// pointWithinZone checks if a Point geometry is within a zone polygon
func pointWithinZone(pointGeom map[string]interface{}, zoneGeometry json.RawMessage) bool {
	var zonePoly map[string]interface{}
	if err := json.Unmarshal(zoneGeometry, &zonePoly); err != nil {
		return false
	}

	// Get point coordinates
	pointCoords, ok := pointGeom["coordinates"].([]interface{})
	if !ok || len(pointCoords) < 2 {
		return false
	}

	lng, ok1 := pointCoords[0].(float64)
	lat, ok2 := pointCoords[1].(float64)
	if !ok1 || !ok2 {
		return false
	}

	// Get polygon coordinates
	polyCoords, ok := zonePoly["coordinates"].([]interface{})
	if !ok || len(polyCoords) == 0 {
		return false
	}

	outerRing, ok := polyCoords[0].([]interface{})
	if !ok || len(outerRing) == 0 {
		return false
	}

	return pointInPolygon(lng, lat, outerRing)
}

// pointInPolygon uses the ray casting algorithm to determine if a point is inside a polygon
func pointInPolygon(lng, lat float64, ring []interface{}) bool {
	inside := false
	j := len(ring) - 1

	for i := 0; i < len(ring); i++ {
		coord, ok := ring[i].([]interface{})
		if !ok || len(coord) < 2 {
			continue
		}
		coordJ, ok := ring[j].([]interface{})
		if !ok || len(coordJ) < 2 {
			continue
		}

		xi, ok1 := coord[0].(float64)
		yi, ok2 := coord[1].(float64)
		xj, ok3 := coordJ[0].(float64)
		yj, ok4 := coordJ[1].(float64)

		if !ok1 || !ok2 || !ok3 || !ok4 {
			continue
		}

		intersect := ((yi > lat) != (yj > lat)) && (lng < (xj-xi)*(lat-yi)/(yj-yi)+xi)
		if intersect {
			inside = !inside
		}

		j = i
	}

	return inside
}

// GetInvalidZoneContainment returns only elements that failed the check
func GetInvalidZoneContainment(db *gorm.DB, mapID, zone, schema, segmentTable, mapIDCol, zoneCol, idCol, qcFlagCol string) ([]ZoneContainmentResult, error) {
	summary, err := CheckZoneContainment(db, mapID, zone, schema, segmentTable, mapIDCol, zoneCol, idCol, qcFlagCol)
	if err != nil {
		return nil, err
	}

	var invalid []ZoneContainmentResult
	for _, result := range summary.Results {
		if !result.IsValid {
			invalid = append(invalid, result)
		}
	}

	return invalid, nil
}

// UpdateZoneContainmentFlags updates QC flags for invalid segments
func UpdateZoneContainmentFlags(db *gorm.DB, segmentIDs []int, schema, segmentTable, idCol, qcFlagCol string) error {
	if len(segmentIDs) == 0 {
		return nil
	}

	table := fmt.Sprintf("%s.%s", schema, segmentTable)

	return db.Table(table).
		Where(fmt.Sprintf("%s IN ?", idCol), segmentIDs).
		Update(qcFlagCol, "zone_containment_invalid").Error
}

// getStringPointer returns a pointer to the given string
func getStringPointer(s string) *string {
	return &s
}
3
oldqc/Backend/qc/zone_containment.go:Zone.Identifier
Normal file
@ -0,0 +1,3 @@
[ZoneTransfer]
ZoneId=3
ReferrerUrl=C:\Users\AlexanderHall\Downloads\Auto_LLD-QC-main.zip
2
oldqc/Backend/run.bat
Normal file
@ -0,0 +1,2 @@
@echo off
go build -o server.exe main.go && server.exe
3
oldqc/Backend/run.bat:Zone.Identifier
Normal file
@ -0,0 +1,3 @@
[ZoneTransfer]
ZoneId=3
ReferrerUrl=C:\Users\AlexanderHall\Downloads\Auto_LLD-QC-main.zip
64
oldqc/CLAUDE.md
Normal file
@ -0,0 +1,64 @@
# CLAUDE.md

This file provides guidance to Claude Code (claude.ai/code) when working with code in this repository.

## Project Overview

Auto-LLD QC is a web-based spatial quality control tool for fiber network designs. It connects to a PostGIS database to run automated QC checks on network segments through a map interface.

## Commands

### Backend Development
```bash
cd Backend
go run main.go   # Start the server (default port 8080)
go mod tidy      # Clean up dependencies
```

### Frontend Development
- Frontend is static HTML/CSS/JavaScript served from the `Frontend/` directory
- Open `Frontend/index.html` in a browser or use a static file server
- The backend serves the frontend at the root path when running

## Architecture

### Backend Structure (Go + Gin)
- **main.go**: Main server setup, database connection, API routes
- **models/models.go**: GORM database models for segments, sites, poles, access points
- **qc/**: Quality control modules (graph_connect.go, handholes.go, segment_single_span.go)
- **db/connect.go**: Database connection utilities

### Frontend Structure
- **index.html**: Main UI with market/zone dropdowns and QC control buttons
- **main.js**: JavaScript handling map display (Leaflet), API calls, QC operations
- **styles.css**: UI styling

### Database Integration
- Uses a PostGIS spatial database with configurable schema/table names
- Environment variables in `.env` control the database connection and table configuration
- Key tables: segments (main data), sites, poles, access_points, map_projects

### API Endpoints
- `/api/markets` - Get available market/project options
- `/api/zones` - Get zones for the selected market
- `/api/segments` - Get segment data as GeoJSON
- `/api/sites`, `/api/poles`, `/api/access_points` - Get spatial features
- `/api/qc/*` - QC check endpoints (connectivity, single-span, etc.)

### QC Module Pattern
Each QC check follows this pattern:
1. Separate Go file in the `/qc` directory
2. Route registration function called from main.go
3. Returns a GeoJSON FeatureCollection of affected segments
4. Updates the `qc_flag` column to mark issues
### Environment Configuration

The backend uses environment variables for the database connection and table/column names:

- DB_HOST, DB_PORT, DB_USER, DB_PASS, DB_NAME
- SCHEMA_NAME, SEGMENT_TABLE, ZONE_COLUMN, MAPID_COLUMN, etc.
- SERVER_PORT for the web server port
### Key Technologies

- **Backend**: Go 1.24+, Gin web framework, GORM ORM, PostGIS
- **Frontend**: Vanilla JavaScript, Leaflet.js for maps, Turf.js for spatial operations
- **Database**: PostgreSQL with PostGIS extension
3 oldqc/CLAUDE.md:Zone.Identifier Normal file
@ -0,0 +1,3 @@
[ZoneTransfer]
ZoneId=3
ReferrerUrl=C:\Users\AlexanderHall\Downloads\Auto_LLD-QC-main.zip
95 oldqc/Frontend/index.html Normal file
@ -0,0 +1,95 @@
<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="UTF-8" />
  <title>LLD QC Tool</title>
  <meta name="viewport" content="width=device-width, initial-scale=1.0" />
  <link rel="preconnect" href="https://fonts.googleapis.com">
  <link rel="preconnect" href="https://fonts.gstatic.com" crossorigin>
  <link href="https://fonts.googleapis.com/css2?family=Inter:wght@400;500;600;700&display=swap" rel="stylesheet">
  <link rel="stylesheet" href="https://unpkg.com/leaflet@1.9.4/dist/leaflet.css" />
  <link rel="stylesheet" href="/static/styles.css" />
</head>
<body>
  <!-- Full-screen map -->
  <div id="map"></div>

  <!-- Floating header bar -->
  <div id="header">
    <div class="header-left">
      <h1>LLD QC Tool</h1>
    </div>
    <div class="header-center">
      <div class="control-group">
        <label>Market</label>
        <select id="marketSelect" class="modern-select">
          <option value="">Loading...</option>
        </select>
      </div>

      <div class="control-group">
        <label>Zone</label>
        <select id="zoneSelect" class="modern-select" disabled>
          <option value="">Select Market First</option>
        </select>
      </div>

      <div class="control-group">
        <label>Segment Filter</label>
        <select id="segmentTypeFilter" class="modern-select">
          <option value="">All Types</option>
          <option value="aerial">Aerial</option>
          <option value="underground">Underground</option>
          <option value="proposed">Proposed</option>
        </select>
      </div>

      <div class="control-group">
        <label>QC Operation</label>
        <select id="qcOperationSelect" class="modern-select">
          <option value="">Select QC Check...</option>
          <option value="connectivity">Network Connectivity</option>
          <option value="single-span">Single Span</option>
          <option value="aerial-endpoints">Aerial Endpoints</option>
          <option value="underground-endpoints">Underground Endpoints</option>
          <option value="zone-containment">Zone Containment</option>
          <option value="handhole-connectivity">Handhole Connectivity</option>
          <option value="site-connectivity">Site Connectivity</option>
        </select>
      </div>

      <button id="runQCButton" class="primary-button" disabled>
        <span class="button-text">Run QC</span>
        <span class="button-loader"></span>
      </button>

      <button id="clearQCButton" class="secondary-button" style="display:none;">
        Clear QC Results
      </button>

      <div class="control-group endpoint-toggle" id="endpointToggleContainer" style="display:none;">
        <label>
          <input type="checkbox" id="endpointToggle" checked>
          Show Endpoints
        </label>
      </div>
    </div>
  </div>

  <!-- Status/Results Panel -->
  <div id="statusPanel">
    <div id="status"></div>
    <div id="qcResult"></div>
  </div>

  <!-- Loading overlay -->
  <div id="loadingOverlay" style="display:none;">
    <div class="loader-spinner"></div>
    <div class="loader-text">Loading data...</div>
  </div>

  <script src="https://unpkg.com/leaflet@1.9.4/dist/leaflet.js"></script>
  <script src="https://unpkg.com/@turf/turf/turf.min.js"></script>
  <script src="/static/main.js"></script>
</body>
</html>
3 oldqc/Frontend/index.html:Zone.Identifier Normal file
@ -0,0 +1,3 @@
[ZoneTransfer]
ZoneId=3
ReferrerUrl=C:\Users\AlexanderHall\Downloads\Auto_LLD-QC-main.zip
2013 oldqc/Frontend/main.js Normal file
File diff suppressed because it is too large
3 oldqc/Frontend/main.js:Zone.Identifier Normal file
@ -0,0 +1,3 @@
[ZoneTransfer]
ZoneId=3
ReferrerUrl=C:\Users\AlexanderHall\Downloads\Auto_LLD-QC-main.zip
449 oldqc/Frontend/styles.css Normal file
@ -0,0 +1,449 @@
/* ========================================
   GLOBAL RESET & BASE STYLES
   ======================================== */
* {
  margin: 0;
  padding: 0;
  box-sizing: border-box;
}

html, body {
  height: 100%;
  width: 100%;
  font-family: 'Inter', -apple-system, BlinkMacSystemFont, 'Segoe UI', Roboto, sans-serif;
  color: #1a1a1a;
  overflow: hidden;
  background: #0a0e27;
}

/* ========================================
   FULL-SCREEN MAP
   ======================================== */
#map {
  position: absolute;
  top: 0;
  left: 0;
  width: 100%;
  height: 100%;
  z-index: 1;
}

/* ========================================
   FLOATING HEADER BAR
   ======================================== */
#header {
  position: absolute;
  top: 20px;
  left: 50%;
  transform: translateX(-50%);
  z-index: 1000;
  background: rgba(255, 255, 255, 0.98);
  backdrop-filter: blur(10px);
  padding: 16px 24px;
  border-radius: 16px;
  box-shadow: 0 8px 32px rgba(0, 0, 0, 0.12), 0 2px 8px rgba(0, 0, 0, 0.08);
  display: flex;
  align-items: center;
  gap: 24px;
  max-width: 95%;
  transition: all 0.3s cubic-bezier(0.4, 0, 0.2, 1);
}

#header:hover {
  box-shadow: 0 12px 48px rgba(0, 0, 0, 0.15), 0 4px 12px rgba(0, 0, 0, 0.1);
}

.header-left h1 {
  font-size: 20px;
  font-weight: 700;
  color: #0a0e27;
  letter-spacing: -0.5px;
  white-space: nowrap;
}

.header-center {
  display: flex;
  align-items: center;
  gap: 16px;
  flex-wrap: wrap;
}

/* ========================================
   CONTROL GROUPS & LABELS
   ======================================== */
.control-group {
  display: flex;
  flex-direction: column;
  gap: 6px;
}

.control-group label {
  font-size: 11px;
  font-weight: 600;
  text-transform: uppercase;
  letter-spacing: 0.5px;
  color: #6b7280;
}

/* Endpoint toggle checkbox styling */
.endpoint-toggle {
  display: flex;
  align-items: center;
  justify-content: center;
  padding: 8px 12px;
  background: rgba(59, 130, 246, 0.05);
  border-radius: 8px;
  border: 2px solid #e5e7eb;
  transition: all 0.2s ease;
}

.endpoint-toggle:hover {
  background: rgba(59, 130, 246, 0.1);
  border-color: #3b82f6;
}

.endpoint-toggle label {
  display: flex;
  align-items: center;
  gap: 8px;
  margin: 0;
  cursor: pointer;
  font-size: 13px;
  font-weight: 600;
  color: #1a1a1a;
  text-transform: none;
  letter-spacing: normal;
  user-select: none;
}

.endpoint-toggle input[type="checkbox"] {
  cursor: pointer;
  width: 18px;
  height: 18px;
  margin: 0;
  accent-color: #3b82f6;
}

/* ========================================
   MODERN SELECT DROPDOWNS
   ======================================== */
.modern-select {
  padding: 10px 14px;
  font-size: 14px;
  font-weight: 500;
  border: 2px solid #e5e7eb;
  border-radius: 10px;
  background: white;
  color: #1a1a1a;
  cursor: pointer;
  transition: all 0.2s ease;
  min-width: 140px;
  outline: none;
}

.modern-select:hover:not(:disabled) {
  border-color: #3b82f6;
  box-shadow: 0 0 0 3px rgba(59, 130, 246, 0.1);
}

.modern-select:focus {
  border-color: #3b82f6;
  box-shadow: 0 0 0 3px rgba(59, 130, 246, 0.2);
}

.modern-select:disabled {
  opacity: 0.5;
  cursor: not-allowed;
  background: #f9fafb;
}

/* ========================================
   PRIMARY BUTTON
   ======================================== */
.primary-button {
  padding: 10px 24px;
  font-size: 14px;
  font-weight: 600;
  color: white;
  background: linear-gradient(135deg, #3b82f6 0%, #2563eb 100%);
  border: none;
  border-radius: 10px;
  cursor: pointer;
  transition: all 0.2s ease;
  position: relative;
  overflow: hidden;
  box-shadow: 0 4px 12px rgba(59, 130, 246, 0.3);
}

.primary-button:hover:not(:disabled) {
  transform: translateY(-2px);
  box-shadow: 0 6px 20px rgba(59, 130, 246, 0.4);
}

.primary-button:active:not(:disabled) {
  transform: translateY(0);
}

.primary-button:disabled {
  opacity: 0.5;
}

/* ========================================
   SECONDARY BUTTON
   ======================================== */
.secondary-button {
  padding: 10px 24px;
  font-size: 14px;
  font-weight: 600;
  color: #374151;
  background: white;
  border: 2px solid #d1d5db;
  border-radius: 10px;
  cursor: pointer;
  transition: all 0.2s ease;
  margin-left: 10px;
}

.secondary-button:hover {
  background: #f9fafb;
  border-color: #9ca3af;
  transform: translateY(-1px);
  box-shadow: 0 4px 12px rgba(0, 0, 0, 0.1);
}

.secondary-button:active {
  transform: translateY(0);
}

.secondary-button:disabled {
  opacity: 0.5;
  cursor: not-allowed;
  transform: none;
}

.primary-button.loading .button-text {
  opacity: 0;
}

.primary-button.loading .button-loader {
  display: block;
}

.button-loader {
  display: none;
  position: absolute;
  top: 50%;
  left: 50%;
  transform: translate(-50%, -50%);
  width: 16px;
  height: 16px;
  border: 2px solid rgba(255, 255, 255, 0.3);
  border-top-color: white;
  border-radius: 50%;
  animation: spin 0.8s linear infinite;
}

@keyframes spin {
  to { transform: translate(-50%, -50%) rotate(360deg); }
}

/* ========================================
   STATUS PANEL
   ======================================== */
#statusPanel {
  position: absolute;
  bottom: 20px;
  left: 50%;
  transform: translateX(-50%);
  z-index: 1000;
  background: rgba(255, 255, 255, 0.98);
  backdrop-filter: blur(10px);
  padding: 16px 24px;
  border-radius: 12px;
  box-shadow: 0 8px 32px rgba(0, 0, 0, 0.12);
  max-width: 600px;
  min-width: 300px;
  transition: all 0.3s ease;
  opacity: 0;
  pointer-events: none;
}

#statusPanel.visible {
  opacity: 1;
  pointer-events: all;
}

#status {
  font-size: 13px;
  font-weight: 500;
  color: #6b7280;
  margin-bottom: 4px;
}

#qcResult {
  font-size: 14px;
  font-weight: 600;
  line-height: 1.6;
}

/* ========================================
   LOADING OVERLAY
   ======================================== */
#loadingOverlay {
  position: absolute;
  top: 0;
  left: 0;
  width: 100%;
  height: 100%;
  background: rgba(10, 14, 39, 0.8);
  backdrop-filter: blur(4px);
  z-index: 9999;
  display: flex;
  flex-direction: column;
  align-items: center;
  justify-content: center;
  gap: 16px;
}

.loader-spinner {
  width: 48px;
  height: 48px;
  border: 4px solid rgba(59, 130, 246, 0.2);
  border-top-color: #3b82f6;
  border-radius: 50%;
  animation: spin 1s linear infinite;
}

.loader-text {
  font-size: 16px;
  font-weight: 600;
  color: white;
  letter-spacing: 0.5px;
}

/* ========================================
   LEGEND STYLES
   ======================================== */
.legend {
  background: rgba(255, 255, 255, 0.98);
  backdrop-filter: blur(10px);
  padding: 16px;
  border-radius: 12px;
  box-shadow: 0 8px 32px rgba(0, 0, 0, 0.15);
  font-family: 'Inter', sans-serif;
  font-size: 13px;
  line-height: 1.5;
  min-width: 220px;
  transition: all 0.3s ease;
}

.legend:hover {
  box-shadow: 0 12px 48px rgba(0, 0, 0, 0.2);
}

.legend-title {
  font-weight: 700;
  font-size: 14px;
  margin-bottom: 12px;
  padding-bottom: 8px;
  border-bottom: 2px solid #e5e7eb;
  color: #0a0e27;
  letter-spacing: -0.3px;
}

.legend-section-title {
  font-weight: 600;
  font-size: 11px;
  text-transform: uppercase;
  letter-spacing: 0.5px;
  color: #6b7280;
  margin-top: 12px;
  margin-bottom: 6px;
}

.legend-item {
  display: flex;
  align-items: center;
  margin: 8px 0;
  gap: 10px;
  transition: all 0.2s ease;
  padding: 4px;
  border-radius: 6px;
}

.legend-item-small {
  display: flex;
  align-items: center;
  margin: 4px 0 4px 28px;
  gap: 8px;
  font-size: 12px;
  color: #6b7280;
  padding: 2px;
}

.legend-item:hover {
  background: rgba(59, 130, 246, 0.05);
}

.legend-item input[type="checkbox"] {
  cursor: pointer;
  width: 18px;
  height: 18px;
  margin: 0;
  accent-color: #3b82f6;
}

.legend-item label {
  cursor: pointer;
  margin: 0;
  font-size: 13px;
  font-weight: 500;
  color: #1a1a1a;
  user-select: none;
  flex: 1;
}

.legend-item svg {
  display: block;
  flex-shrink: 0;
}

/* ========================================
   CUSTOM MARKER STYLES
   ======================================== */
.handhole-marker {
  background: transparent !important;
  border: none !important;
}

/* Leaflet popup customization */
.leaflet-popup-content-wrapper {
  border-radius: 12px;
  box-shadow: 0 8px 32px rgba(0, 0, 0, 0.15);
  font-family: 'Inter', sans-serif;
}

.leaflet-popup-content {
  font-size: 13px;
  line-height: 1.6;
  margin: 14px 16px;
}

.leaflet-popup-tip {
  box-shadow: 0 3px 14px rgba(0, 0, 0, 0.15);
}

/* ========================================
   RESPONSIVE DESIGN
   ======================================== */
@media (max-width: 1200px) {
  #header {
    flex-direction: column;
    align-items: flex-start;
    gap: 16px;
    top: 10px;
    padding: 16px;
  }

  .header-center {
    width: 100%;
    justify-content: space-between;
  }
}
3 oldqc/Frontend/styles.css:Zone.Identifier Normal file
@ -0,0 +1,3 @@
[ZoneTransfer]
ZoneId=3
ReferrerUrl=C:\Users\AlexanderHall\Downloads\Auto_LLD-QC-main.zip
104 oldqc/README.md Normal file
@ -0,0 +1,104 @@
# Auto-LLD QC

**Auto-LLD QC** is a web-based spatial quality control (QC) tool designed to automate verification checks for fiber network designs. It connects to a PostGIS database and lets users filter data by market and zone, then run various spatial QC checks via a clean map interface.

## 🗺️ What It Does

- Allows users to select a **market** and **zone** from dropdowns to load relevant data.
- Displays map data using **Leaflet.js** and **Turf.js**.
- Enables multiple QC operations via user-triggered buttons:
  - **Graph connectivity** — ensures all segments are spatially connected.
  - **Single-span check** — validates that each segment has exactly two vertices.
  - **Underground termination check** — verifies underground segments end at a pole or handhole.
  - **Site presence and location** — confirms that sites exist and fall within the correct zone.
## 🧰 Tech Stack

- **Frontend:** HTML, CSS, JavaScript, [Leaflet.js](https://leafletjs.com/), [Turf.js](https://turfjs.org/)
- **Backend:** [Go (Golang)](https://golang.org/) with [Gin](https://github.com/gin-gonic/gin)
- **Database:** PostgreSQL with PostGIS extension
- **Deployment target:** Web application (self-hosted or internal use)

---

## 📂 Project Structure

```text
auto-lld-qc/
├── frontend/
│   ├── index.html
│   ├── main.js
│   └── styles.css
├── backend/
│   ├── main.go
│   ├── .env
│   ├── go.mod / go.sum
│   ├── models/
│   │   └── models.go
│   └── qc/
│       ├── graph_connect.go
│       ├── handholes.go
│       └── segment_single_span.go
```

## ⚙️ Setup & Usage

### Prerequisites

- Go 1.20+
- PostgreSQL + PostGIS
- Node.js (optional, for frontend bundling or tooling)

### 1. Clone the repo

```bash
git clone https://github.com/yourusername/auto-lld-qc.git
cd auto-lld-qc
```

### 2. Configure environment

In `backend/.env`, set the following variables:

```env
DB_HOST=your-db-host
DB_PORT=5432
DB_USER=your-user
DB_PASSWORD=your-password
DB_NAME=your-db
```

### 3. Run the backend

```bash
cd backend
go run main.go
```

### 4. Open the frontend

Open `frontend/index.html` in a browser (or serve it via a static file server like `http-server` or Gin).

## ✅ QC Features (Complete & Planned)

| Feature | Status |
| --- | --- |
| Graph connectivity check | ✅ Done |
| Segment single-span (2 vertices) | ✅ Done |
| Handhole/Pole connection check | ✅ Done |
| Site existence + zone inclusion | ⏳ Planned |
| Permit validation | ⏳ Planned |
| Access point location validation | ⏳ Planned |

## 📌 To-Do

- Complete remaining QC features (site checks, permits, access points)
- Add UI loading indicators and error handling
- Optionally dockerize for easier deployment
- Write unit tests for backend QC logic

## 🧑‍💻 Developer Notes

- Uses GORM for Go/Postgres ORM modeling.
- Turf.js handles spatial logic like intersections and geometry analysis on the frontend.
- Each QC module is implemented in a separate `.go` file under `/qc`, with a dedicated API route.
- Modular structure allows for easy addition of new QC checks.

## 📃 License

MIT License
3 oldqc/README.md:Zone.Identifier Normal file
@ -0,0 +1,3 @@
[ZoneTransfer]
ZoneId=3
ReferrerUrl=C:\Users\AlexanderHall\Downloads\Auto_LLD-QC-main.zip
131 oldqc/SITE_CONNECTIVITY_FEATURE.md Normal file
@ -0,0 +1,131 @@
# Site Connectivity Feature Documentation

## Overview

The Site Connectivity feature allows users to check whether all sites (homes) are connected to the network infrastructure. It uses spatial analysis to determine connectivity based on distance thresholds and updates the database with connectivity status for QGIS analysis.

## Features

### 1. Automated Connectivity Check

- Calculates the distance from each site to the nearest network segment
- Configurable distance threshold (default: 50 meters)
- Uses PostGIS spatial functions for accurate distance calculations

### 2. Database Integration

- Adds a `connectivity_status` field to the sites table (`connected`/`disconnected`)
- Adds a `connectivity_distance` field with the distance to the nearest segment in meters
- Creates a database index for performance optimization

### 3. Visual Feedback

- Highlights disconnected sites on the map with red markers
- Displays connectivity statistics (total, connected, disconnected, rate)
- Popup information showing site details and distance to the network

### 4. QGIS Integration

- Updated site attributes can be viewed in QGIS
- Filter and symbolize sites by connectivity status
- Use the `connectivity_distance` field for further analysis

## Implementation

### Backend Changes

- **`qc/site_connectivity.go`**: New QC module with connectivity analysis
- **`models/models.go`**: Updated Sites struct with connectivity fields
- **`main.go`**: Registered new route for the site connectivity endpoint
- **`migrations/add_site_connectivity_fields.sql`**: Database migration script

### Frontend Changes

- **`index.html`**: Added "Check Site Connectivity" button
- **`main.js`**: Added JavaScript functionality for connectivity checking

### API Endpoint

```
GET /api/qc/site-connectivity?map_id={market_id}&zone={zone}&max_distance={meters}
```

**Parameters:**

- `map_id` (required): Market/project ID
- `zone` (optional): Zone filter
- `max_distance` (optional): Distance threshold in meters (default: 50)

**Response:**

```json
{
  "total_sites": 150,
  "connected_sites": 145,
  "disconnected_sites": 5,
  "connectivity_rate": 96.7,
  "max_distance_meters": 50,
  "results": [
    {
      "site_id": 123,
      "site_name": "Site Name",
      "is_connected": false,
      "nearest_distance": 75.5,
      "connectivity_status": "disconnected",
      "geometry": {...}
    }
  ]
}
```
## Usage Instructions

### 1. Database Setup

Run the migration script to add the required columns:

```sql
-- Execute this in your database
\i Backend/migrations/add_site_connectivity_fields.sql
```

### 2. Running the Check

1. Start the backend server: `cd Backend && go run main.go`
2. Open the web application
3. Select a market and, optionally, a zone
4. Click the "Check Site Connectivity" button
5. View results on the map and in the status display

### 3. QGIS Analysis

1. Connect to your PostGIS database in QGIS
2. Load the sites layer
3. View the attribute table to see the connectivity fields:
   - `connectivity_status`: "connected" or "disconnected"
   - `connectivity_distance`: Distance in meters to the nearest segment
4. Use these fields for filtering, symbology, or further analysis

## Configuration

### Distance Threshold

The default 50-meter threshold can be adjusted by:

- Modifying the frontend JavaScript (`max_distance=50`)
- Passing different values via the API parameter

Consider local regulations and technical requirements when choosing a threshold.

### Performance Considerations

- Uses PostGIS spatial indexes for efficient distance calculations
- Database index created on `connectivity_status` for fast filtering
- Suitable for datasets with thousands of sites

## Workflow Integration

This feature integrates seamlessly with existing QC workflows:

1. Load market and zone data
2. Run connectivity analysis alongside other QC checks
3. Export results to QGIS for detailed analysis and remediation planning
4. Update the network design based on connectivity gaps

## Benefits

1. **Automated Analysis**: Eliminates manual site-by-site connectivity checking
2. **Database Persistence**: Results stored for historical analysis and reporting
3. **QGIS Integration**: Seamless workflow for GIS analysts
4. **Configurable**: Adjustable distance thresholds for different scenarios
5. **Visual Feedback**: Clear identification of problem areas on the map

## Future Enhancements

Potential improvements that could be added:

- Multiple distance thresholds for different site types
- Batch connectivity updates for multiple markets
- Export functionality for disconnected sites
- Integration with network planning tools
- Automated report generation
3 oldqc/SITE_CONNECTIVITY_FEATURE.md:Zone.Identifier Normal file
@ -0,0 +1,3 @@
[ZoneTransfer]
ZoneId=3
ReferrerUrl=C:\Users\AlexanderHall\Downloads\Auto_LLD-QC-main.zip
48 oldqc/UNDERGROUND_ENDPOINTS_README.md Normal file
@ -0,0 +1,48 @@
# Underground Endpoint QC Feature

## Overview

This feature checks that all underground segments have either a pole or an access point at both endpoints, ensuring proper network connectivity and infrastructure planning.

## Implementation Details

### Backend (`/Backend/qc/underground_endpoints.go`)

- **Main Function**: `CheckUndergroundEndpoints()` - validates underground segments
- **API Endpoints**:
  - `GET /api/qc/underground-endpoints` - Full QC summary
  - `GET /api/qc/underground-endpoints/invalid` - Only invalid segments
  - `POST /api/qc/underground-endpoints/update-flags` - Update QC flags in the database

### Frontend Integration

- **Button**: "Underground Endpoint QC" button added to the control panel
- **Visual Feedback**: Invalid segments highlighted in purple with dashed lines
- **Popup Information**: Shows segment ID, endpoint details, and specific issues

### Key Features

1. **Spatial Analysis**: Uses a 10-meter buffer (~0.0001 degrees) to find nearby poles/access points
2. **Geometry Support**: Handles both LineString and MultiLineString geometries
3. **Database Integration**: Updates QC flags for invalid segments
4. **Visual Mapping**: Highlights problematic segments on the map
5. **Detailed Reporting**: Shows which endpoints are missing infrastructure and what type is nearby

### QC Validation Logic

For each underground segment:

1. Extract the start and end coordinates from the geometry
2. Search for poles and access points within the buffer distance
3. Check whether both endpoints have adjacent infrastructure
4. Report specific issues (missing start/end endpoints)
5. Update the database with a QC flag if issues are found
### Usage

1. Select a market and zone from the dropdowns
2. Click the "Underground Endpoint QC" button
3. View results in the QC result panel
4. Invalid segments are highlighted on the map in purple
5. Click on highlighted segments for detailed popup information

### Database Schema Requirements

- Segments table with `segment_type = 'underground'` (case insensitive)
- Poles table with point geometries
- Access points table with point geometries
- QC flag column for marking invalid segments

This feature follows the same patterns as the other QC modules in the application and integrates seamlessly with the existing infrastructure.
3 oldqc/UNDERGROUND_ENDPOINTS_README.md:Zone.Identifier Normal file
@ -0,0 +1,3 @@
[ZoneTransfer]
ZoneId=3
ReferrerUrl=C:\Users\AlexanderHall\Downloads\Auto_LLD-QC-main.zip
17 oldqc/alter_test_table.sql Normal file
@ -0,0 +1,17 @@
-- Add missing columns to eli_test.segment2 to match the expected schema

ALTER TABLE eli_test.segment2
ADD COLUMN IF NOT EXISTS id_0 INTEGER,
ADD COLUMN IF NOT EXISTS mapid INTEGER,
ADD COLUMN IF NOT EXISTS segment_type VARCHAR(80),
ADD COLUMN IF NOT EXISTS segment_status VARCHAR(80),
ADD COLUMN IF NOT EXISTS id INTEGER,
ADD COLUMN IF NOT EXISTS protection_status VARCHAR(80),
ADD COLUMN IF NOT EXISTS qc_flag VARCHAR(255),
ADD COLUMN IF NOT EXISTS group_1 TEXT;

-- Copy "Group 1" data to group_1 if it exists
UPDATE eli_test.segment2 SET group_1 = "Group 1" WHERE "Group 1" IS NOT NULL;

-- Verify the new structure
\d eli_test.segment2
3 oldqc/alter_test_table.sql:Zone.Identifier Normal file
@ -0,0 +1,3 @@
[ZoneTransfer]
ZoneId=3
ReferrerUrl=C:\Users\AlexanderHall\Downloads\Auto_LLD-QC-main.zip
96 oldqc/create_sites_table_proper.sql Normal file
@ -0,0 +1,96 @@
-- Recreate sites table with proper column types
-- Drop the old table with incorrect column types
DROP TABLE IF EXISTS eli_test.sites CASCADE;

-- Create sites table with proper structure
CREATE TABLE eli_test.sites (
    id_0 SERIAL PRIMARY KEY,
    gid INTEGER,
    id INTEGER,
    "MapProjectID" INTEGER,
    "Latitude" DOUBLE PRECISION,
    "Longitude" DOUBLE PRECISION,
    "Exclude" INTEGER,
    "Custom" INTEGER,
    "Color" VARCHAR(50),
    "Opacity" VARCHAR(50),
    "ShapeID" VARCHAR(50),
    "StyleSize" VARCHAR(50),
    "CreatedBy" INTEGER,
    "CreatedDate" BIGINT,
    "ModifiedBy" INTEGER,
    "ModifiedDate" BIGINT,
    "HistoryID" INTEGER,
    "Name" VARCHAR(255),
    "StatusID" INTEGER,
    "Group 1" VARCHAR(255),
    "Group 2" VARCHAR(255),
    "IconTypeID" INTEGER,
    "SchoolID" VARCHAR(100),
    "SiteDemarc" VARCHAR(255),
    "Address1" VARCHAR(255),
    "Address2" VARCHAR(255),
    "City" VARCHAR(100),
    "State" VARCHAR(50),
    "Zip" VARCHAR(20),
    geometry GEOMETRY(Point, 4326)
);

-- Create spatial index on geometry
CREATE INDEX sidx_sites_geometry ON eli_test.sites USING GIST(geometry);

-- Insert test sites with proper data
INSERT INTO eli_test.sites (id, "MapProjectID", "Name", "Address1", "City", "State", "Zip", "Group 1", geometry)
VALUES
-- Sites in Zone_A (correctly within the zone)
(1001, 1, 'Home-1001', '123 Market St', 'San Francisco', 'CA', '94102', 'Zone_A', ST_GeomFromText('POINT(-122.4190 37.7755)', 4326)),
(1002, 1, 'Home-1002', '456 Mission St', 'San Francisco', 'CA', '94103', 'Zone_A', ST_GeomFromText('POINT(-122.4175 37.7765)', 4326)),
(1003, 1, 'Home-1003', '789 Howard St', 'San Francisco', 'CA', '94103', 'Zone_A', ST_GeomFromText('POINT(-122.4160 37.7775)', 4326)),
(1004, 1, 'Home-1004', '321 Folsom St', 'San Francisco', 'CA', '94107', 'Zone_A', ST_GeomFromText('POINT(-122.4150 37.7785)', 4326)),
(1005, 1, 'Home-1005', '555 Bryant St', 'San Francisco', 'CA', '94107', 'Zone_A', ST_GeomFromText('POINT(-122.4200 37.7710)', 4326)),
(1006, 1, 'Home-1006', '888 Harrison St', 'San Francisco', 'CA', '94107', 'Zone_A', ST_GeomFromText('POINT(-122.4205 37.7805)', 4326)),
(1007, 1, 'Home-1007', '999 7th St', 'San Francisco', 'CA', '94103', 'Zone_A', ST_GeomFromText('POINT(-122.4135 37.7735)', 4326)),
(1008, 1, 'Home-1008', '111 8th St', 'San Francisco', 'CA', '94103', 'Zone_A', ST_GeomFromText('POINT(-122.4125 37.7745)', 4326)),
(1009, 1, 'Home-1009', '222 9th St', 'San Francisco', 'CA', '94103', 'Zone_A', ST_GeomFromText('POINT(-122.4330 37.7685)', 4326)),
(1010, 1, 'Home-1010', '333 10th St', 'San Francisco', 'CA', '94103', 'Zone_A', ST_GeomFromText('POINT(-122.4325 37.7695)', 4326)),
(1011, 1, 'Home-1011', '444 11th St', 'San Francisco', 'CA', '94103', 'Zone_A', ST_GeomFromText('POINT(-122.4105 37.7750)', 4326)),
(1012, 1, 'Home-1012', '666 Townsend St', 'San Francisco', 'CA', '94107', 'Zone_A', ST_GeomFromText('POINT(-122.4340 37.7720)', 4326)),
|
||||
-- Sites in Zone_B (correctly within the zone)
|
||||
(2001, 1, 'Home-2001', '100 Oak St', 'San Francisco', 'CA', '94102', 'Zone_B', ST_GeomFromText('POINT(-122.4090 37.7855)', 4326)),
|
||||
(2002, 1, 'Home-2002', '200 Fell St', 'San Francisco', 'CA', '94102', 'Zone_B', ST_GeomFromText('POINT(-122.4078 37.7865)', 4326)),
|
||||
(2003, 1, 'Home-2003', '300 Hayes St', 'San Francisco', 'CA', '94102', 'Zone_B', ST_GeomFromText('POINT(-122.4070 37.7875)', 4326)),
|
||||
|
||||
-- Sites in Zone_C (correctly within the zone)
|
||||
(3001, 1, 'Home-3001', '400 Grove St', 'San Francisco', 'CA', '94117', 'Zone_C', ST_GeomFromText('POINT(-122.3990 37.7955)', 4326)),
|
||||
(3002, 1, 'Home-3002', '500 Fulton St', 'San Francisco', 'CA', '94117', 'Zone_C', ST_GeomFromText('POINT(-122.3980 37.7965)', 4326)),
|
||||
|
||||
-- INVALID: Site labeled Zone_A but physically located in Zone_B
|
||||
(1013, 1, 'Home-1013-INVALID', '777 Invalid Location', 'San Francisco', 'CA', '94102', 'Zone_A', ST_GeomFromText('POINT(-122.4080 37.7860)', 4326)),
|
||||
|
||||
-- INVALID: Site labeled Zone_B but physically located in Zone_A
|
||||
(2004, 1, 'Home-2004-INVALID', '888 Wrong Zone', 'San Francisco', 'CA', '94103', 'Zone_B', ST_GeomFromText('POINT(-122.4200 37.7750)', 4326)),
|
||||
|
||||
-- INVALID: Site labeled Zone_C but physically outside all zones
|
||||
(3003, 1, 'Home-3003-INVALID', '999 Outside All Zones', 'San Francisco', 'CA', '94110', 'Zone_C', ST_GeomFromText('POINT(-122.3800 37.7500)', 4326)),
|
||||
|
||||
-- INVALID: Site labeled Zone_A but physically outside all zones
|
||||
(1014, 1, 'Home-1014-INVALID', '1111 Far Away', 'San Francisco', 'CA', '94110', 'Zone_A', ST_GeomFromText('POINT(-122.3700 37.7400)', 4326)),
|
||||
|
||||
-- INVALID: Site with NULL Group 1 (unassigned) but inside Zone_A
|
||||
(1015, 1, 'Home-1015-UNASSIGNED', '1212 Unassigned St', 'San Francisco', 'CA', '94103', NULL, ST_GeomFromText('POINT(-122.4250 37.7800)', 4326)),
|
||||
|
||||
-- Site in Zone_D (correctly in Zone_D)
|
||||
(4001, 1, 'Home-4001', '1313 Empty Zone', 'San Francisco', 'CA', '94110', 'Zone_D', ST_GeomFromText('POINT(-122.3875 37.7625)', 4326));
|
||||
|
||||
-- Verify sites were inserted
|
||||
SELECT COUNT(*) as total_sites FROM eli_test.sites;
|
||||
SELECT id, "Name", "Address1", "Group 1", ST_AsText(geometry) as location FROM eli_test.sites ORDER BY id LIMIT 5;
|
||||
|
||||
-- Summary
|
||||
SELECT
|
||||
"Group 1" as zone,
|
||||
COUNT(*) as site_count
|
||||
FROM eli_test.sites
|
||||
GROUP BY "Group 1"
|
||||
ORDER BY "Group 1";
|
||||
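The zone-assignment QC implied by the INVALID rows above (a site labeled Zone_A but located in Zone_B territory) boils down to a point-in-polygon test per site. A minimal pure-Python sketch follows; the `zone_a` rectangle is a made-up boundary for illustration only, since the actual zone geometries are not in this file.

```python
# Sketch of the zone-assignment QC: ray-casting point-in-polygon test.
# zone_a below is a hypothetical rectangle, not real zone geometry.

def point_in_polygon(x, y, poly):
    """Return True if (x, y) falls inside the polygon [(x, y), ...]."""
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        # Edge crosses the horizontal ray from (x, y)?
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

zone_a = [(-122.435, 37.768), (-122.410, 37.768),
          (-122.410, 37.782), (-122.435, 37.782)]  # hypothetical boundary

# Home-1001 is labeled Zone_A and sits inside; Home-1013-INVALID is
# labeled Zone_A but sits to the north-east, in Zone_B territory.
print(point_in_polygon(-122.4190, 37.7755, zone_a))  # True
print(point_in_polygon(-122.4080, 37.7860, zone_a))  # False
```

In the real tool this check would run in PostGIS (e.g. via a spatial join against the zone polygons); the Python version just shows the geometry logic.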
3
oldqc/create_sites_table_proper.sql:Zone.Identifier
Normal file
@ -0,0 +1,3 @@
[ZoneTransfer]
ZoneId=3
ReferrerUrl=C:\Users\AlexanderHall\Downloads\Auto_LLD-QC-main.zip
90
oldqc/populate_test_data.sql
Normal file
@ -0,0 +1,90 @@
-- Clear existing test data
TRUNCATE TABLE eli_test.segment2 RESTART IDENTITY;

-- Insert comprehensive test data with all required columns
INSERT INTO eli_test.segment2
(id_0, mapid, segment_type, segment_status, id, protection_status, qc_flag, group_1, type, length, cost, fdh_id, geom)
VALUES
-- Zone_A segments (mapid = 1001)
(1, 1001, 'Aerial', 'Proposed', 101, 'Protected', NULL, 'Zone_A', 'Aerial', 150.5, 1500.00, 1,
  ST_Transform(ST_GeomFromText('LINESTRING(-122.4194 37.7749, -122.4184 37.7759)', 4326), 6561)),

(2, 1001, 'Aerial', 'Proposed', 102, 'Protected', NULL, 'Zone_A', 'Aerial', 145.2, 1450.00, 1,
  ST_Transform(ST_GeomFromText('LINESTRING(-122.4184 37.7759, -122.4174 37.7769)', 4326), 6561)),

(3, 1001, 'Aerial', 'Proposed', 103, 'Protected', NULL, 'Zone_A', 'Aerial', 148.8, 1480.00, 1,
  ST_Transform(ST_GeomFromText('LINESTRING(-122.4174 37.7769, -122.4164 37.7779)', 4326), 6561)),

(4, 1001, 'Underground', 'Proposed', 104, 'Protected', NULL, 'Zone_A', 'Underground', 142.3, 2850.00, 1,
  ST_Transform(ST_GeomFromText('LINESTRING(-122.4164 37.7779, -122.4154 37.7789)', 4326), 6561)),

(5, 1001, 'Underground', 'Proposed', 105, 'Protected', NULL, 'Zone_A', 'Underground', 138.7, 2775.00, 1,
  ST_Transform(ST_GeomFromText('LINESTRING(-122.4154 37.7789, -122.4144 37.7799)', 4326), 6561)),

(6, 1001, 'Aerial', 'Proposed', 106, 'Unprotected', NULL, 'Zone_A', 'Aerial', 155.0, 1550.00, 1,
  ST_Transform(ST_GeomFromText('LINESTRING(-122.4200 37.7800, -122.4210 37.7810)', 4326), 6561)),

-- Long span for single-span testing
(7, 1001, 'Aerial', 'Proposed', 107, 'Protected', NULL, 'Zone_A', 'Aerial', 2200.0, 22000.00, 1,
  ST_Transform(ST_GeomFromText('LINESTRING(-122.4220 37.7750, -122.4280 37.7760)', 4326), 6561)),

(8, 1001, 'Underground', 'Proposed', 108, 'Protected', NULL, 'Zone_A', 'Underground', 15.5, 310.00, 1,
  ST_Transform(ST_GeomFromText('LINESTRING(-122.4250 37.7820, -122.4249 37.7821)', 4326), 6561)),

-- Disconnected/isolated segment
(9, 1001, 'Aerial', 'Proposed', 109, 'Protected', NULL, 'Zone_A', 'Aerial', 140.0, 1400.00, 1,
  ST_Transform(ST_GeomFromText('LINESTRING(-122.5000 37.8000, -122.4990 37.8010)', 4326), 6561)),

-- Branching segments
(10, 1001, 'Aerial', 'Proposed', 110, 'Protected', NULL, 'Zone_A', 'Aerial', 145.0, 1450.00, 1,
  ST_Transform(ST_GeomFromText('LINESTRING(-122.4140 37.7730, -122.4130 37.7740)', 4326), 6561)),

(11, 1001, 'Aerial', 'Proposed', 111, 'Protected', NULL, 'Zone_A', 'Aerial', 142.0, 1420.00, 1,
  ST_Transform(ST_GeomFromText('LINESTRING(-122.4130 37.7740, -122.4120 37.7750)', 4326), 6561)),

(12, 1001, 'Aerial', 'Proposed', 112, 'Protected', NULL, 'Zone_A', 'Aerial', 144.0, 1440.00, 1,
  ST_Transform(ST_GeomFromText('LINESTRING(-122.4130 37.7740, -122.4120 37.7730)', 4326), 6561)),

(13, 1001, 'Aerial', 'Constructed', 113, 'Protected', NULL, 'Zone_A', 'Aerial', 152.0, 1520.00, 1,
  ST_Transform(ST_GeomFromText('LINESTRING(-122.4340 37.7680, -122.4330 37.7690)', 4326), 6561)),

(14, 1001, 'Underground', 'Design', 114, 'Unprotected', NULL, 'Zone_A', 'Underground', 139.5, 2790.00, 1,
  ST_Transform(ST_GeomFromText('LINESTRING(-122.4330 37.7690, -122.4320 37.7700)', 4326), 6561)),

(15, 1001, 'Aerial', 'Proposed', 115, 'Protected', NULL, 'Zone_A', 'Aerial', 160.0, 1600.00, 1,
  ST_Transform(ST_GeomFromText('LINESTRING(-122.4300 37.7700, -122.4100 37.7700)', 4326), 6561)),

-- Zone_B segments (mapid = 1002)
(16, 1002, 'Aerial', 'Existing', 201, 'Protected', NULL, 'Zone_B', 'Aerial', 147.5, 1475.00, 2,
  ST_Transform(ST_GeomFromText('LINESTRING(-122.4094 37.7849, -122.4084 37.7859)', 4326), 6561)),

(17, 1002, 'Underground', 'Existing', 202, 'Protected', NULL, 'Zone_B', 'Underground', 143.2, 2865.00, 2,
  ST_Transform(ST_GeomFromText('LINESTRING(-122.4084 37.7859, -122.4074 37.7869)', 4326), 6561)),

(18, 1002, 'Aerial', 'Proposed', 203, 'Protected', NULL, 'Zone_B', 'Aerial', 149.9, 1499.00, 2,
  ST_Transform(ST_GeomFromText('LINESTRING(-122.4074 37.7869, -122.4064 37.7879)', 4326), 6561)),

-- Zone_C segments (mapid = 1003)
(19, 1003, 'Aerial', 'Proposed', 301, 'Protected', NULL, 'Zone_C', 'Aerial', 146.3, 1463.00, 3,
  ST_Transform(ST_GeomFromText('LINESTRING(-122.3994 37.7949, -122.3984 37.7959)', 4326), 6561)),

(20, 1003, 'Underground', 'Proposed', 302, 'Protected', NULL, 'Zone_C', 'Underground', 141.8, 2836.00, 3,
  ST_Transform(ST_GeomFromText('LINESTRING(-122.3984 37.7959, -122.3974 37.7969)', 4326), 6561));

-- Add corresponding market entries to map_projects if they don't exist
INSERT INTO eli_test.map_projects (mapid, project)
SELECT 1001, 'Test Market A'
WHERE NOT EXISTS (SELECT 1 FROM eli_test.map_projects WHERE mapid = 1001)
UNION ALL
SELECT 1002, 'Test Market B'
WHERE NOT EXISTS (SELECT 1 FROM eli_test.map_projects WHERE mapid = 1002)
UNION ALL
SELECT 1003, 'Test Market C'
WHERE NOT EXISTS (SELECT 1 FROM eli_test.map_projects WHERE mapid = 1003);

-- Verify the data
SELECT COUNT(*) as total_segments FROM eli_test.segment2;
SELECT mapid, group_1, COUNT(*) as segment_count
FROM eli_test.segment2
GROUP BY mapid, group_1
ORDER BY mapid, group_1;
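The "Disconnected/isolated segment" row (id 109) exists to exercise a connectivity QC: a segment is isolated when neither of its endpoints is shared with any other segment. A small sketch of that check, using the WGS 84 endpoint coordinates of a few segments from the INSERTs above (rounding guards against floating-point jitter):

```python
# Sketch of the connectivity QC: flag segments whose endpoints are not
# shared with any other segment. Coordinates are taken from the test data.
from collections import Counter

segments = {
    101: ((-122.4194, 37.7749), (-122.4184, 37.7759)),
    102: ((-122.4184, 37.7759), (-122.4174, 37.7769)),
    103: ((-122.4174, 37.7769), (-122.4164, 37.7779)),
    109: ((-122.5000, 37.8000), (-122.4990, 37.8010)),  # isolated segment
}

def key(pt, places=6):
    """Round coordinates so nearly-equal endpoints compare equal."""
    return (round(pt[0], places), round(pt[1], places))

counts = Counter(key(p) for ends in segments.values() for p in ends)
isolated = [sid for sid, (a, b) in segments.items()
            if counts[key(a)] == 1 and counts[key(b)] == 1]
print(isolated)  # [109]
```

The production check would more likely be a PostGIS self-join on `ST_DWithin` over segment endpoints; the dictionary version only demonstrates the endpoint-sharing idea.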
3
oldqc/populate_test_data.sql:Zone.Identifier
Normal file
@ -0,0 +1,3 @@
[ZoneTransfer]
ZoneId=3
ReferrerUrl=C:\Users\AlexanderHall\Downloads\Auto_LLD-QC-main.zip
80
oldqc/test_data.sql
Normal file
@ -0,0 +1,80 @@
-- Test data for eli_test.segment2
-- This creates sample fiber network segments with various configurations to test QC features
-- Actual columns: gid (auto), type, length, cost, fdh_id, geom (MultiLineString, SRID 6561), "Group 1"

-- Insert test segments with different characteristics
INSERT INTO eli_test.segment2 (type, length, cost, fdh_id, "Group 1", geom)
VALUES
-- Normal aerial segments in Zone A
('Aerial', 150.5, 1500.00, 1, 'Zone_A',
  ST_Transform(ST_GeomFromText('LINESTRING(-122.4194 37.7749, -122.4184 37.7759)', 4326), 6561)),

('Aerial', 145.2, 1450.00, 1, 'Zone_A',
  ST_Transform(ST_GeomFromText('LINESTRING(-122.4184 37.7759, -122.4174 37.7769)', 4326), 6561)),

('Aerial', 148.8, 1480.00, 1, 'Zone_A',
  ST_Transform(ST_GeomFromText('LINESTRING(-122.4174 37.7769, -122.4164 37.7779)', 4326), 6561)),

-- Underground segments in Zone A
('Underground', 142.3, 2850.00, 1, 'Zone_A',
  ST_Transform(ST_GeomFromText('LINESTRING(-122.4164 37.7779, -122.4154 37.7789)', 4326), 6561)),

('Underground', 138.7, 2775.00, 1, 'Zone_A',
  ST_Transform(ST_GeomFromText('LINESTRING(-122.4154 37.7789, -122.4144 37.7799)', 4326), 6561)),

-- More segments in Zone A
('Aerial', 155.0, 1550.00, 1, 'Zone_A',
  ST_Transform(ST_GeomFromText('LINESTRING(-122.4200 37.7800, -122.4210 37.7810)', 4326), 6561)),

('Aerial', 850.0, 8500.00, 1, 'Zone_A',
  ST_Transform(ST_GeomFromText('LINESTRING(-122.4220 37.7750, -122.4280 37.7760)', 4326), 6561)),

-- Segments in Zone B
('Aerial', 147.5, 1475.00, 2, 'Zone_B',
  ST_Transform(ST_GeomFromText('LINESTRING(-122.4094 37.7849, -122.4084 37.7859)', 4326), 6561)),

('Underground', 143.2, 2865.00, 2, 'Zone_B',
  ST_Transform(ST_GeomFromText('LINESTRING(-122.4084 37.7859, -122.4074 37.7869)', 4326), 6561)),

('Aerial', 149.9, 1499.00, 2, 'Zone_B',
  ST_Transform(ST_GeomFromText('LINESTRING(-122.4074 37.7869, -122.4064 37.7879)', 4326), 6561)),

-- Long segment for single-span testing
('Aerial', 2200.0, 22000.00, 1, 'Zone_A',
  ST_Transform(ST_GeomFromText('LINESTRING(-122.4300 37.7700, -122.4100 37.7700)', 4326), 6561)),

-- Very short segment
('Underground', 15.5, 310.00, 1, 'Zone_A',
  ST_Transform(ST_GeomFromText('LINESTRING(-122.4250 37.7820, -122.4249 37.7821)', 4326), 6561)),

-- Disconnected segment (isolated)
('Aerial', 140.0, 1400.00, 1, 'Zone_A',
  ST_Transform(ST_GeomFromText('LINESTRING(-122.5000 37.8000, -122.4990 37.8010)', 4326), 6561)),

-- Multiple segments forming a branch
('Aerial', 145.0, 1450.00, 1, 'Zone_A',
  ST_Transform(ST_GeomFromText('LINESTRING(-122.4140 37.7730, -122.4130 37.7740)', 4326), 6561)),

('Aerial', 142.0, 1420.00, 1, 'Zone_A',
  ST_Transform(ST_GeomFromText('LINESTRING(-122.4130 37.7740, -122.4120 37.7750)', 4326), 6561)),

('Aerial', 144.0, 1440.00, 1, 'Zone_A',
  ST_Transform(ST_GeomFromText('LINESTRING(-122.4130 37.7740, -122.4120 37.7730)', 4326), 6561)),

-- Segments in Zone C
('Aerial', 146.3, 1463.00, 3, 'Zone_C',
  ST_Transform(ST_GeomFromText('LINESTRING(-122.3994 37.7949, -122.3984 37.7959)', 4326), 6561)),

('Underground', 141.8, 2836.00, 3, 'Zone_C',
  ST_Transform(ST_GeomFromText('LINESTRING(-122.3984 37.7959, -122.3974 37.7969)', 4326), 6561)),

-- Additional varied segments
('Aerial', 152.0, 1520.00, 1, 'Zone_A',
  ST_Transform(ST_GeomFromText('LINESTRING(-122.4340 37.7680, -122.4330 37.7690)', 4326), 6561)),

('Underground', 139.5, 2790.00, 1, 'Zone_A',
  ST_Transform(ST_GeomFromText('LINESTRING(-122.4330 37.7690, -122.4320 37.7700)', 4326), 6561));

-- Verify insertion
SELECT COUNT(*) as total_segments FROM eli_test.segment2;
SELECT "Group 1", COUNT(*) as segment_count FROM eli_test.segment2 GROUP BY "Group 1" ORDER BY "Group 1";
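The "Long segment for single-span testing" rows rely on a length check against the segment's real geographic extent. A haversine sketch of that check on two endpoint pairs from the data above; the 250 m cutoff here is purely illustrative, not a threshold from the tool:

```python
# Sketch of the long-span QC: compute each aerial segment's great-circle
# length from its WGS 84 endpoints and flag anything over a threshold.
import math

def haversine_m(lon1, lat1, lon2, lat2):
    """Great-circle distance in metres between two WGS 84 points."""
    r = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = p2 - p1
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

spans = {
    "normal": ((-122.4194, 37.7749), (-122.4184, 37.7759)),
    "long":   ((-122.4220, 37.7750), (-122.4280, 37.7760)),
}
for name, ((lon1, lat1), (lon2, lat2)) in spans.items():
    d = haversine_m(lon1, lat1, lon2, lat2)
    flag = "FLAG" if d > 250 else "ok"  # 250 m cutoff is illustrative
    print(f"{name}: {d:.0f} m {flag}")
```

In PostGIS the equivalent is `ST_Length(geom::geography)` compared against the threshold, which avoids reimplementing the spherical math.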
3
oldqc/test_data.sql:Zone.Identifier
Normal file
@ -0,0 +1,3 @@
[ZoneTransfer]
ZoneId=3
ReferrerUrl=C:\Users\AlexanderHall\Downloads\Auto_LLD-QC-main.zip
375
oldqc/test_data_poles_accesspoints.sql
Normal file
@ -0,0 +1,375 @@
-- Test data for eli_test poles and access points
-- This creates poles and access points (handholes) for testing QC features
-- Coordinates match existing segment endpoints from test_data.sql

-- ============================================
-- CREATE TABLES
-- ============================================

-- Create poles table
CREATE TABLE IF NOT EXISTS eli_test.poles (
    gid SERIAL PRIMARY KEY,
    id INTEGER,
    mapprojectid INTEGER,
    latitude VARCHAR,
    longitude VARCHAR,
    custom INTEGER,
    color VARCHAR,
    shapeid VARCHAR,
    stylesize VARCHAR,
    opacity VARCHAR,
    createdby INTEGER,
    createddate INTEGER,
    modifiedby INTEGER,
    modifieddate INTEGER,
    historyid INTEGER,
    name VARCHAR,
    tags VARCHAR,
    group1 VARCHAR,
    group2 VARCHAR,
    mrstateid INTEGER,
    commsmrchoiceid INTEGER,
    powermrchoiceid VARCHAR,
    poleheight VARCHAR,
    attachmentheight VARCHAR,
    mrnotes VARCHAR,
    owner VARCHAR,
    geom GEOMETRY(Point, 6561)
);

-- Create access_points table (handholes)
CREATE TABLE IF NOT EXISTS eli_test.access_points (
    gid SERIAL PRIMARY KEY,
    id INTEGER,
    name VARCHAR,
    mapprojectid INTEGER,
    latitude VARCHAR,
    longitude VARCHAR,
    manufacturer VARCHAR,
    size VARCHAR,
    locked INTEGER,
    description VARCHAR,
    aka VARCHAR,
    createdby INTEGER,
    createddate INTEGER,
    modifiedby VARCHAR,
    modifieddate VARCHAR,
    historyid INTEGER,
    group1 VARCHAR,
    group2 VARCHAR,
    typeid INTEGER,
    statusid INTEGER,
    crmvendorid VARCHAR,
    billdate VARCHAR,
    geom GEOMETRY(Point, 6561)
);
-- ============================================
-- INSERT POLES (for aerial segments)
-- ============================================

-- Poles for the connected aerial segment chain in Zone_A
INSERT INTO eli_test.poles (id, mapprojectid, name, owner, poleheight, attachmentheight, group1, geom)
VALUES
-- Pole at start of first aerial segment
(101, 1, 'Pole-101', 'Test Utility', '40', '35', 'Zone_A',
  ST_Transform(ST_GeomFromText('POINT(-122.4194 37.7749)', 4326), 6561)),

-- Pole at junction (end of segment 1, start of segment 2)
(102, 1, 'Pole-102', 'Test Utility', '40', '35', 'Zone_A',
  ST_Transform(ST_GeomFromText('POINT(-122.4184 37.7759)', 4326), 6561)),

-- Pole at junction (end of segment 2, start of segment 3)
(103, 1, 'Pole-103', 'Test Utility', '40', '35', 'Zone_A',
  ST_Transform(ST_GeomFromText('POINT(-122.4174 37.7769)', 4326), 6561)),

-- Pole at end of third aerial segment (connects to underground)
(104, 1, 'Pole-104', 'Test Utility', '40', '35', 'Zone_A',
  ST_Transform(ST_GeomFromText('POINT(-122.4164 37.7779)', 4326), 6561)),

-- Poles for another aerial segment
(105, 1, 'Pole-105', 'Test Utility', '40', '35', 'Zone_A',
  ST_Transform(ST_GeomFromText('POINT(-122.4200 37.7800)', 4326), 6561)),

(106, 1, 'Pole-106', 'Test Utility', '40', '35', 'Zone_A',
  ST_Transform(ST_GeomFromText('POINT(-122.4210 37.7810)', 4326), 6561)),

-- Poles for long aerial segment (will be INVALID for single span - too long)
(107, 1, 'Pole-107', 'Test Utility', '40', '35', 'Zone_A',
  ST_Transform(ST_GeomFromText('POINT(-122.4220 37.7750)', 4326), 6561)),

(108, 1, 'Pole-108', 'Test Utility', '40', '35', 'Zone_A',
  ST_Transform(ST_GeomFromText('POINT(-122.4280 37.7760)', 4326), 6561)),

-- Poles for branching segments
(109, 1, 'Pole-109', 'Test Utility', '40', '35', 'Zone_A',
  ST_Transform(ST_GeomFromText('POINT(-122.4140 37.7730)', 4326), 6561)),

(110, 1, 'Pole-110', 'Test Utility', '40', '35', 'Zone_A',
  ST_Transform(ST_GeomFromText('POINT(-122.4130 37.7740)', 4326), 6561)),

(111, 1, 'Pole-111', 'Test Utility', '40', '35', 'Zone_A',
  ST_Transform(ST_GeomFromText('POINT(-122.4120 37.7750)', 4326), 6561)),

(112, 1, 'Pole-112', 'Test Utility', '40', '35', 'Zone_A',
  ST_Transform(ST_GeomFromText('POINT(-122.4120 37.7730)', 4326), 6561)),

-- Poles for long span test segment
(113, 1, 'Pole-113', 'Test Utility', '50', '45', 'Zone_A',
  ST_Transform(ST_GeomFromText('POINT(-122.4300 37.7700)', 4326), 6561)),

(114, 1, 'Pole-114', 'Test Utility', '50', '45', 'Zone_A',
  ST_Transform(ST_GeomFromText('POINT(-122.4100 37.7700)', 4326), 6561)),

-- Additional poles for new test segments
(115, 1, 'Pole-115', 'Test Utility', '40', '35', 'Zone_A',
  ST_Transform(ST_GeomFromText('POINT(-122.4340 37.7680)', 4326), 6561)),

(116, 1, 'Pole-116', 'Test Utility', '40', '35', 'Zone_A',
  ST_Transform(ST_GeomFromText('POINT(-122.4330 37.7690)', 4326), 6561)),

-- ONLY ONE POLE for disconnected aerial segment (will be invalid - no pole at start)
(117, 1, 'Pole-117', 'Test Utility', '40', '35', 'Zone_A',
  ST_Transform(ST_GeomFromText('POINT(-122.4990 37.8010)', 4326), 6561));
-- ============================================
-- INSERT ACCESS POINTS / HANDHOLES (for underground segments)
-- ============================================

INSERT INTO eli_test.access_points (id, name, mapprojectid, description, manufacturer, size, typeid, statusid, group1, geom)
VALUES
-- Access point at junction between aerial and underground (already has pole 104)
-- This tests that underground can connect to EITHER pole OR access point
(201, 'Handhole-201', 1, 'Transition point from aerial to underground', 'CommScope', '24x36', 1, 1, 'Zone_A',
  ST_Transform(ST_GeomFromText('POINT(-122.4164 37.7779)', 4326), 6561)),

-- Access point at junction of underground segments
(202, 'Handhole-202', 1, 'Underground junction', 'CommScope', '24x36', 1, 1, 'Zone_A',
  ST_Transform(ST_GeomFromText('POINT(-122.4154 37.7789)', 4326), 6561)),

-- Access point at end of underground segment 2
(203, 'Handhole-203', 1, 'Underground termination', 'CommScope', '24x36', 1, 1, 'Zone_A',
  ST_Transform(ST_GeomFromText('POINT(-122.4144 37.7799)', 4326), 6561)),

-- Access point for short underground segment start
(204, 'Handhole-204', 1, 'Short segment start', 'Preformed', '18x24', 1, 1, 'Zone_A',
  ST_Transform(ST_GeomFromText('POINT(-122.4250 37.7820)', 4326), 6561)),

(205, 'Handhole-205', 1, 'Short segment end', 'Preformed', '18x24', 1, 1, 'Zone_A',
  ST_Transform(ST_GeomFromText('POINT(-122.4249 37.7821)', 4326), 6561)),

-- Access point at junction between underground and aerial (has pole 116)
(206, 'Handhole-206', 1, 'Transition point underground to aerial', 'CommScope', '24x36', 1, 1, 'Zone_A',
  ST_Transform(ST_GeomFromText('POINT(-122.4330 37.7690)', 4326), 6561)),

-- Access point at end of underground segment
(207, 'Handhole-207', 1, 'Underground endpoint', 'CommScope', '30x48', 1, 1, 'Zone_A',
  ST_Transform(ST_GeomFromText('POINT(-122.4320 37.7700)', 4326), 6561)),

-- Additional access points for new underground segments we'll create
(208, 'Handhole-208', 1, 'Underground network point', 'CommScope', '24x36', 1, 1, 'Zone_A',
  ST_Transform(ST_GeomFromText('POINT(-122.4180 37.7720)', 4326), 6561)),

(209, 'Handhole-209', 1, 'Underground network point', 'CommScope', '24x36', 1, 1, 'Zone_A',
  ST_Transform(ST_GeomFromText('POINT(-122.4170 37.7730)', 4326), 6561)),

(210, 'Handhole-210', 1, 'Underground vault', 'Oldcastle', '30x48', 2, 1, 'Zone_A',
  ST_Transform(ST_GeomFromText('POINT(-122.4160 37.7740)', 4326), 6561)),

(211, 'Handhole-211', 1, 'Underground junction', 'CommScope', '24x36', 1, 1, 'Zone_A',
  ST_Transform(ST_GeomFromText('POINT(-122.4150 37.7750)', 4326), 6561));

-- ============================================
-- INSERT ADDITIONAL UNDERGROUND SEGMENTS FOR TESTING
-- ============================================

-- These segments will test various scenarios for the underground endpoints QC

INSERT INTO eli_test.segment2 (type, length, cost, fdh_id, "Group 1", geom)
VALUES
-- VALID: Underground segment with access points at both ends (208 -> 209)
('Underground', 135.0, 2700.00, 1, 'Zone_A',
  ST_Transform(ST_GeomFromText('LINESTRING(-122.4180 37.7720, -122.4170 37.7730)', 4326), 6561)),

-- VALID: Underground segment with access points at both ends (209 -> 210)
('Underground', 132.0, 2640.00, 1, 'Zone_A',
  ST_Transform(ST_GeomFromText('LINESTRING(-122.4170 37.7730, -122.4160 37.7740)', 4326), 6561)),

-- VALID: Underground segment with access points at both ends (210 -> 211)
('Underground', 138.0, 2760.00, 1, 'Zone_A',
  ST_Transform(ST_GeomFromText('LINESTRING(-122.4160 37.7740, -122.4150 37.7750)', 4326), 6561)),

-- INVALID: Underground segment with NO endpoints (missing both access points)
('Underground', 145.0, 2900.00, 1, 'Zone_A',
  ST_Transform(ST_GeomFromText('LINESTRING(-122.4100 37.7650, -122.4090 37.7660)', 4326), 6561)),

-- INVALID: Underground segment with only ONE endpoint (start has access point 211, end missing)
('Underground', 142.0, 2840.00, 1, 'Zone_A',
  ST_Transform(ST_GeomFromText('LINESTRING(-122.4150 37.7750, -122.4140 37.7760)', 4326), 6561)),

-- INVALID: Underground segment with only ONE endpoint (end point only, start missing)
('Underground', 148.0, 2960.00, 1, 'Zone_A',
  ST_Transform(ST_GeomFromText('LINESTRING(-122.4190 37.7710, -122.4180 37.7720)', 4326), 6561)),

-- VALID: Long underground segment with endpoints
('Underground', 520.0, 10400.00, 1, 'Zone_A',
  ST_Transform(ST_GeomFromText('LINESTRING(-122.4250 37.7850, -122.4200 37.7850)', 4326), 6561));

-- Create access points for the last valid underground segment
INSERT INTO eli_test.access_points (id, name, mapprojectid, description, manufacturer, size, typeid, statusid, group1, geom)
VALUES
(212, 'Handhole-212', 1, 'Long underground run start', 'Oldcastle', '36x60', 2, 1, 'Zone_A',
  ST_Transform(ST_GeomFromText('POINT(-122.4250 37.7850)', 4326), 6561)),

(213, 'Handhole-213', 1, 'Long underground run end', 'Oldcastle', '36x60', 2, 1, 'Zone_A',
  ST_Transform(ST_GeomFromText('POINT(-122.4200 37.7850)', 4326), 6561));
-- ============================================
-- INSERT MULTI-VERTEX AERIAL SEGMENTS FOR SINGLE SPAN QC TESTING
-- ============================================

-- These segments will test the single span QC (should have exactly 2 vertices)

-- INVALID: Aerial segment with 3 vertices (multi-span)
INSERT INTO eli_test.segment2 (type, length, cost, fdh_id, "Group 1", geom)
VALUES
('Aerial', 290.0, 2900.00, 1, 'Zone_A',
  ST_Transform(ST_GeomFromText('LINESTRING(-122.4380 37.7620, -122.4370 37.7630, -122.4360 37.7640)', 4326), 6561)),

-- INVALID: Aerial segment with 4 vertices (multi-span)
('Aerial', 435.0, 4350.00, 1, 'Zone_A',
  ST_Transform(ST_GeomFromText('LINESTRING(-122.4380 37.7650, -122.4370 37.7660, -122.4360 37.7670, -122.4350 37.7680)', 4326), 6561)),

-- INVALID: Aerial segment with 5 vertices (many spans)
('Aerial', 580.0, 5800.00, 1, 'Zone_A',
  ST_Transform(ST_GeomFromText('LINESTRING(-122.4450 37.7620, -122.4440 37.7630, -122.4430 37.7640, -122.4420 37.7650, -122.4410 37.7660)', 4326), 6561));

-- Add poles for the multi-vertex segments (at their endpoints only, not mid-points)
INSERT INTO eli_test.poles (id, mapprojectid, name, owner, poleheight, attachmentheight, group1, geom)
VALUES
-- Poles for 3-vertex segment
(118, 1, 'Pole-118', 'Test Utility', '40', '35', 'Zone_A',
  ST_Transform(ST_GeomFromText('POINT(-122.4380 37.7620)', 4326), 6561)),

(119, 1, 'Pole-119', 'Test Utility', '40', '35', 'Zone_A',
  ST_Transform(ST_GeomFromText('POINT(-122.4360 37.7640)', 4326), 6561)),

-- Poles for 4-vertex segment
(120, 1, 'Pole-120', 'Test Utility', '40', '35', 'Zone_A',
  ST_Transform(ST_GeomFromText('POINT(-122.4380 37.7650)', 4326), 6561)),

(121, 1, 'Pole-121', 'Test Utility', '40', '35', 'Zone_A',
  ST_Transform(ST_GeomFromText('POINT(-122.4350 37.7680)', 4326), 6561)),

-- Poles for 5-vertex segment
(122, 1, 'Pole-122', 'Test Utility', '40', '35', 'Zone_A',
  ST_Transform(ST_GeomFromText('POINT(-122.4450 37.7620)', 4326), 6561)),

(123, 1, 'Pole-123', 'Test Utility', '40', '35', 'Zone_A',
  ST_Transform(ST_GeomFromText('POINT(-122.4410 37.7660)', 4326), 6561));
-- ============================================
-- DUPLICATE POLES TEST (for checking only ONE pole at each endpoint)
-- ============================================

-- Create an aerial segment with duplicate poles at one endpoint
INSERT INTO eli_test.segment2 (type, length, cost, fdh_id, "Group 1", geom)
VALUES
-- INVALID: Aerial segment where one endpoint has 2 poles
('Aerial', 150.0, 1500.00, 1, 'Zone_A',
  ST_Transform(ST_GeomFromText('LINESTRING(-122.4500 37.7900, -122.4490 37.7910)', 4326), 6561));

-- Add poles for this segment
INSERT INTO eli_test.poles (id, mapprojectid, name, owner, poleheight, attachmentheight, group1, geom)
VALUES
-- Start point: ONE pole (correct)
(124, 1, 'Pole-124', 'Test Utility', '40', '35', 'Zone_A',
  ST_Transform(ST_GeomFromText('POINT(-122.4500 37.7900)', 4326), 6561)),

-- End point: TWO poles at same location (INVALID - should only be ONE)
(125, 1, 'Pole-125-A', 'Test Utility', '40', '35', 'Zone_A',
  ST_Transform(ST_GeomFromText('POINT(-122.4490 37.7910)', 4326), 6561)),

(126, 1, 'Pole-125-B', 'Test Utility', '45', '38', 'Zone_A',
  ST_Transform(ST_GeomFromText('POINT(-122.4490 37.7910)', 4326), 6561));

-- Create another test segment with THREE poles at one endpoint for extreme case testing
INSERT INTO eli_test.segment2 (type, length, cost, fdh_id, "Group 1", geom)
VALUES
-- INVALID: Aerial segment where one endpoint has 3 poles
('Aerial', 155.0, 1550.00, 1, 'Zone_A',
  ST_Transform(ST_GeomFromText('LINESTRING(-122.4520 37.7920, -122.4510 37.7930)', 4326), 6561));

-- Add poles for this segment
INSERT INTO eli_test.poles (id, mapprojectid, name, owner, poleheight, attachmentheight, group1, geom)
VALUES
-- Start point: ONE pole (correct)
(127, 1, 'Pole-127', 'Test Utility', '40', '35', 'Zone_A',
  ST_Transform(ST_GeomFromText('POINT(-122.4520 37.7920)', 4326), 6561)),

-- End point: THREE poles at same location (INVALID - should only be ONE)
(128, 1, 'Pole-128-A', 'Test Utility', '40', '35', 'Zone_A',
  ST_Transform(ST_GeomFromText('POINT(-122.4510 37.7930)', 4326), 6561)),

(129, 1, 'Pole-128-B', 'Test Utility', '42', '36', 'Zone_A',
  ST_Transform(ST_GeomFromText('POINT(-122.4510 37.7930)', 4326), 6561)),

(130, 1, 'Pole-128-C', 'Test Utility', '38', '33', 'Zone_A',
  ST_Transform(ST_GeomFromText('POINT(-122.4510 37.7930)', 4326), 6561));
-- ============================================
-- VERIFICATION QUERIES
-- ============================================

-- Verify poles were inserted
SELECT COUNT(*) as total_poles FROM eli_test.poles;
SELECT id, name, ST_AsText(ST_Transform(geom, 4326)) as location FROM eli_test.poles ORDER BY id;

-- Verify access points were inserted
SELECT COUNT(*) as total_access_points FROM eli_test.access_points;
SELECT id, name, ST_AsText(ST_Transform(geom, 4326)) as location FROM eli_test.access_points ORDER BY id;

-- Verify all segments
SELECT COUNT(*) as total_segments FROM eli_test.segment2;
SELECT type, COUNT(*) as segment_count FROM eli_test.segment2 GROUP BY type ORDER BY type;

-- Check for duplicate poles at same location
SELECT ST_AsText(ST_Transform(geom, 4326)) as location, COUNT(*) as pole_count
FROM eli_test.poles
GROUP BY geom
HAVING COUNT(*) > 1
ORDER BY pole_count DESC;

-- ============================================
-- TEST SCENARIO SUMMARY
-- ============================================

/*
SINGLE SPAN QC TEST SCENARIOS:
- Valid (2 vertices): Most of the original aerial segments from test_data.sql
- Invalid (3 vertices): 1 segment added
- Invalid (4 vertices): 1 segment added
- Invalid (5 vertices): 1 segment added

AERIAL ENDPOINT POLE COUNT QC TEST SCENARIOS:
- Valid (exactly 1 pole at each endpoint): Most aerial segments
- Invalid (0 poles at start endpoint): 1 segment (disconnected segment)
- Invalid (2 poles at one endpoint): 1 segment
- Invalid (3 poles at one endpoint): 1 segment

UNDERGROUND ENDPOINTS QC TEST SCENARIOS:
- Valid (both endpoints present):
  * Original 2 underground segments from test_data.sql
  * 4 new underground segments with proper access points
- Invalid (no endpoints): 1 segment
- Invalid (only start endpoint): 1 segment
- Invalid (only end endpoint): 1 segment

TOTAL TEST DATA:
- Poles: 30 poles total (including 5 duplicates at 2 locations)
- Access Points: 13 handholes at underground segment endpoints
- Underground Segments: ~10 total (2 original + 7 new)
- Aerial Segments: ~22 total (original + 3 multi-vertex + 2 duplicate pole tests)
|
||||
All test data uses mapid=1 and Group 1='Zone_A'
|
||||
*/
|
||||
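The aerial endpoint pole-count scenarios described in the summary above can be checked with a single query. This is an illustrative sketch, not part of the committed test data: the `seg.id` column and the 0.5 map-unit snap tolerance are assumptions, since the INSERTs above never name the segment primary key or a tolerance.

```sql
-- Hedged sketch of the aerial endpoint pole-count QC: each endpoint of an
-- Aerial segment should coincide with exactly ONE pole. Rows returned here
-- are violations (0, 2, or 3 poles at an endpoint).
SELECT seg.id, ends.which, COUNT(p.id) AS pole_count
FROM eli_test.segment2 seg
CROSS JOIN LATERAL (VALUES
    ('start', ST_StartPoint(seg.geom)),
    ('end',   ST_EndPoint(seg.geom))
) AS ends(which, pt)
LEFT JOIN eli_test.poles p
       ON ST_DWithin(p.geom, ends.pt, 0.5)  -- assumed snap tolerance
WHERE seg.type = 'Aerial'
GROUP BY seg.id, ends.which
HAVING COUNT(p.id) <> 1;
```

The `LEFT JOIN` keeps endpoints with zero poles in the result, which an inner join would silently drop.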
3
oldqc/test_data_poles_accesspoints.sql:Zone.Identifier
Normal file
@@ -0,0 +1,3 @@
[ZoneTransfer]
ZoneId=3
ReferrerUrl=C:\Users\AlexanderHall\Downloads\Auto_LLD-QC-main.zip
54
oldqc/test_data_sites_fixed.sql
Normal file
@@ -0,0 +1,54 @@
-- Fixed sites data matching actual eli_test.sites table structure
-- Adds "Group 1" column for zone assignment
-- Uses SRID 4326

-- Add Group 1 column if it doesn't exist
ALTER TABLE eli_test.sites ADD COLUMN IF NOT EXISTS "Group 1" varchar(255);

-- Insert sites with proper zone assignments
INSERT INTO eli_test.sites (id, "MapProjectID", "Latitude", "Longitude", "Address1", "Group 1", geometry)
VALUES
-- Sites in Zone_A (correctly within the zone)
(1001, 1, 37.7755, -122.4190, '123 Market St', 'Zone_A', ST_GeomFromText('POINT(-122.4190 37.7755)', 4326)),
(1002, 1, 37.7765, -122.4175, '456 Mission St', 'Zone_A', ST_GeomFromText('POINT(-122.4175 37.7765)', 4326)),
(1003, 1, 37.7775, -122.4160, '789 Howard St', 'Zone_A', ST_GeomFromText('POINT(-122.4160 37.7775)', 4326)),
(1004, 1, 37.7785, -122.4150, '321 Folsom St', 'Zone_A', ST_GeomFromText('POINT(-122.4150 37.7785)', 4326)),
(1005, 1, 37.7710, -122.4200, '555 Bryant St', 'Zone_A', ST_GeomFromText('POINT(-122.4200 37.7710)', 4326)),
(1006, 1, 37.7805, -122.4205, '888 Harrison St', 'Zone_A', ST_GeomFromText('POINT(-122.4205 37.7805)', 4326)),
(1007, 1, 37.7735, -122.4135, '999 7th St', 'Zone_A', ST_GeomFromText('POINT(-122.4135 37.7735)', 4326)),
(1008, 1, 37.7745, -122.4125, '111 8th St', 'Zone_A', ST_GeomFromText('POINT(-122.4125 37.7745)', 4326)),
(1009, 1, 37.7685, -122.4330, '222 9th St', 'Zone_A', ST_GeomFromText('POINT(-122.4330 37.7685)', 4326)),
(1010, 1, 37.7695, -122.4325, '333 10th St', 'Zone_A', ST_GeomFromText('POINT(-122.4325 37.7695)', 4326)),
(1011, 1, 37.7750, -122.4105, '444 11th St', 'Zone_A', ST_GeomFromText('POINT(-122.4105 37.7750)', 4326)),
(1012, 1, 37.7720, -122.4340, '666 Townsend St', 'Zone_A', ST_GeomFromText('POINT(-122.4340 37.7720)', 4326)),

-- Sites in Zone_B (correctly within the zone)
(2001, 1, 37.7855, -122.4090, '100 Oak St', 'Zone_B', ST_GeomFromText('POINT(-122.4090 37.7855)', 4326)),
(2002, 1, 37.7865, -122.4078, '200 Fell St', 'Zone_B', ST_GeomFromText('POINT(-122.4078 37.7865)', 4326)),
(2003, 1, 37.7875, -122.4070, '300 Hayes St', 'Zone_B', ST_GeomFromText('POINT(-122.4070 37.7875)', 4326)),

-- Sites in Zone_C (correctly within the zone)
(3001, 1, 37.7955, -122.3990, '400 Grove St', 'Zone_C', ST_GeomFromText('POINT(-122.3990 37.7955)', 4326)),
(3002, 1, 37.7965, -122.3980, '500 Fulton St', 'Zone_C', ST_GeomFromText('POINT(-122.3980 37.7965)', 4326)),

-- INVALID: Site labeled Zone_A but physically located in Zone_B
(1013, 1, 37.7860, -122.4080, '777 Invalid Location', 'Zone_A', ST_GeomFromText('POINT(-122.4080 37.7860)', 4326)),

-- INVALID: Site labeled Zone_B but physically located in Zone_A
(2004, 1, 37.7750, -122.4200, '888 Wrong Zone', 'Zone_B', ST_GeomFromText('POINT(-122.4200 37.7750)', 4326)),

-- INVALID: Site labeled Zone_C but physically outside all zones
(3003, 1, 37.7500, -122.3800, '999 Outside All Zones', 'Zone_C', ST_GeomFromText('POINT(-122.3800 37.7500)', 4326)),

-- INVALID: Site labeled Zone_A but physically outside all zones
(1014, 1, 37.7400, -122.3700, '1111 Far Away', 'Zone_A', ST_GeomFromText('POINT(-122.3700 37.7400)', 4326)),

-- INVALID: Site with NULL Group 1 (unassigned) but inside Zone_A
(1015, 1, 37.7800, -122.4250, '1212 Unassigned St', NULL, ST_GeomFromText('POINT(-122.4250 37.7800)', 4326)),

-- Site in Zone_D (correctly in Zone_D)
(4001, 1, 37.7625, -122.3875, '1313 Empty Zone', 'Zone_D', ST_GeomFromText('POINT(-122.3875 37.7625)', 4326));

-- Verify sites were inserted
SELECT COUNT(*) as total_sites FROM eli_test.sites WHERE id >= 1001;
SELECT id, "Address1", "Group 1", ST_AsText(geometry) as location FROM eli_test.sites WHERE id >= 1001 ORDER BY id LIMIT 5;
3
oldqc/test_data_sites_fixed.sql:Zone.Identifier
Normal file
@@ -0,0 +1,3 @@
[ZoneTransfer]
ZoneId=3
ReferrerUrl=C:\Users\AlexanderHall\Downloads\Auto_LLD-QC-main.zip
369
oldqc/test_data_zones_sites.sql
Normal file
@@ -0,0 +1,369 @@
-- Test data for zones (info layer) and sites (home points)
-- This creates zone polygons and site points for testing zone containment QC features
-- Coordinates match the existing segment data from test_data.sql

-- ============================================
-- CREATE TABLES
-- ============================================

-- Create info table for zone polygons
CREATE TABLE IF NOT EXISTS eli_test.info (
    id SERIAL PRIMARY KEY,
    name VARCHAR,
    tags VARCHAR,
    description VARCHAR,
    group_1 VARCHAR,
    group_2 VARCHAR,
    mapprojectid INTEGER,
    geom GEOMETRY(Polygon, 6561)
);

-- Create sites table for home points
CREATE TABLE IF NOT EXISTS eli_test.sites (
    gid SERIAL PRIMARY KEY,
    id INTEGER,
    mapprojectid INTEGER,
    longitude VARCHAR,
    latitude VARCHAR,
    exclude INTEGER,
    custom INTEGER,
    color VARCHAR,
    opacity VARCHAR,
    shapeid VARCHAR,
    stylesize VARCHAR,
    createdby INTEGER,
    createddate INTEGER,
    modifiedby INTEGER,
    modifieddate INTEGER,
    historyid INTEGER,
    name VARCHAR,
    statusid INTEGER,
    group1 VARCHAR,
    group2 VARCHAR,
    icontypeid INTEGER,
    schoolid VARCHAR,
    sitedemarc VARCHAR,
    address1 VARCHAR,
    address2 VARCHAR,
    city VARCHAR,
    state VARCHAR,
    zip VARCHAR,
    geom GEOMETRY(Point, 6561)
);

-- ============================================
-- INSERT ZONE POLYGONS (INFO LAYER)
-- ============================================

-- Zone_A: Large zone covering most of the test segments
-- Covers approximately -122.43 to -122.41 longitude, 37.77 to 37.82 latitude
INSERT INTO eli_test.info (name, description, group_1, mapprojectid, geom)
VALUES
('Zone_A', 'Primary test zone covering most network elements', 'Zone_A', 1,
 ST_Transform(ST_GeomFromText('POLYGON((
    -122.4350 37.7700,
    -122.4100 37.7700,
    -122.4100 37.8200,
    -122.4350 37.8200,
    -122.4350 37.7700
 ))', 4326), 6561)),

-- Zone_B: Smaller zone for Zone_B segments
-- Covers approximately -122.41 to -122.406 longitude, 37.784 to 37.789 latitude
('Zone_B', 'Secondary test zone for Zone_B network elements', 'Zone_B', 1,
 ST_Transform(ST_GeomFromText('POLYGON((
    -122.4100 37.7840,
    -122.4060 37.7840,
    -122.4060 37.7890,
    -122.4100 37.7890,
    -122.4100 37.7840
 ))', 4326), 6561)),

-- Zone_C: Another zone for Zone_C segments
-- Covers approximately -122.40 to -122.397 longitude, 37.794 to 37.798 latitude
('Zone_C', 'Tertiary test zone for Zone_C network elements', 'Zone_C', 1,
 ST_Transform(ST_GeomFromText('POLYGON((
    -122.4000 37.7940,
    -122.3970 37.7940,
    -122.3970 37.7980,
    -122.4000 37.7980,
    -122.4000 37.7940
 ))', 4326), 6561)),

-- Zone_D: Small zone with no network elements (for testing edge cases)
('Zone_D', 'Empty zone with no network elements', 'Zone_D', 1,
 ST_Transform(ST_GeomFromText('POLYGON((
    -122.3900 37.7600,
    -122.3850 37.7600,
    -122.3850 37.7650,
    -122.3900 37.7650,
    -122.3900 37.7600
 ))', 4326), 6561));

-- ============================================
-- INSERT SITES (HOME POINTS)
-- ============================================

-- Sites in Zone_A (correctly within the zone)
INSERT INTO eli_test.sites (id, mapprojectid, name, address1, city, state, zip, group1, geom)
VALUES
-- Site near the aerial segment chain
(1001, 1, 'Home-1001', '123 Market St', 'San Francisco', 'CA', '94102', 'Zone_A',
 ST_Transform(ST_GeomFromText('POINT(-122.4190 37.7755)', 4326), 6561)),

(1002, 1, 'Home-1002', '456 Mission St', 'San Francisco', 'CA', '94103', 'Zone_A',
 ST_Transform(ST_GeomFromText('POINT(-122.4175 37.7765)', 4326), 6561)),

(1003, 1, 'Home-1003', '789 Howard St', 'San Francisco', 'CA', '94103', 'Zone_A',
 ST_Transform(ST_GeomFromText('POINT(-122.4160 37.7775)', 4326), 6561)),

-- Site near underground segments
(1004, 1, 'Home-1004', '321 Folsom St', 'San Francisco', 'CA', '94107', 'Zone_A',
 ST_Transform(ST_GeomFromText('POINT(-122.4150 37.7785)', 4326), 6561)),

-- Site near the long aerial segment
(1005, 1, 'Home-1005', '555 Bryant St', 'San Francisco', 'CA', '94107', 'Zone_A',
 ST_Transform(ST_GeomFromText('POINT(-122.4200 37.7710)', 4326), 6561)),

-- Sites near poles
(1006, 1, 'Home-1006', '888 Harrison St', 'San Francisco', 'CA', '94107', 'Zone_A',
 ST_Transform(ST_GeomFromText('POINT(-122.4205 37.7805)', 4326), 6561)),

-- Site near branching segments
(1007, 1, 'Home-1007', '999 7th St', 'San Francisco', 'CA', '94103', 'Zone_A',
 ST_Transform(ST_GeomFromText('POINT(-122.4135 37.7735)', 4326), 6561)),

(1008, 1, 'Home-1008', '111 8th St', 'San Francisco', 'CA', '94103', 'Zone_A',
 ST_Transform(ST_GeomFromText('POINT(-122.4125 37.7745)', 4326), 6561)),

-- Additional sites in Zone_A
(1009, 1, 'Home-1009', '222 9th St', 'San Francisco', 'CA', '94103', 'Zone_A',
 ST_Transform(ST_GeomFromText('POINT(-122.4330 37.7685)', 4326), 6561)),

(1010, 1, 'Home-1010', '333 10th St', 'San Francisco', 'CA', '94103', 'Zone_A',
 ST_Transform(ST_GeomFromText('POINT(-122.4325 37.7695)', 4326), 6561)),

-- Sites near edges of Zone_A (still valid)
(1011, 1, 'Home-1011', '444 11th St', 'San Francisco', 'CA', '94103', 'Zone_A',
 ST_Transform(ST_GeomFromText('POINT(-122.4105 37.7750)', 4326), 6561)),

(1012, 1, 'Home-1012', '666 Townsend St', 'San Francisco', 'CA', '94107', 'Zone_A',
 ST_Transform(ST_GeomFromText('POINT(-122.4340 37.7720)', 4326), 6561)),

-- Sites in Zone_B (correctly within the zone)
(2001, 1, 'Home-2001', '100 Oak St', 'San Francisco', 'CA', '94102', 'Zone_B',
 ST_Transform(ST_GeomFromText('POINT(-122.4090 37.7855)', 4326), 6561)),

(2002, 1, 'Home-2002', '200 Fell St', 'San Francisco', 'CA', '94102', 'Zone_B',
 ST_Transform(ST_GeomFromText('POINT(-122.4078 37.7865)', 4326), 6561)),

(2003, 1, 'Home-2003', '300 Hayes St', 'San Francisco', 'CA', '94102', 'Zone_B',
 ST_Transform(ST_GeomFromText('POINT(-122.4070 37.7875)', 4326), 6561)),

-- Sites in Zone_C (correctly within the zone)
(3001, 1, 'Home-3001', '400 Grove St', 'San Francisco', 'CA', '94117', 'Zone_C',
 ST_Transform(ST_GeomFromText('POINT(-122.3990 37.7955)', 4326), 6561)),

(3002, 1, 'Home-3002', '500 Fulton St', 'San Francisco', 'CA', '94117', 'Zone_C',
 ST_Transform(ST_GeomFromText('POINT(-122.3980 37.7965)', 4326), 6561)),

-- ============================================
-- INVALID TEST CASES: Sites OUTSIDE their assigned zones
-- ============================================

-- Site with group1='Zone_A' but physically located in Zone_B
(1013, 1, 'Home-1013-INVALID', '777 Invalid Location', 'San Francisco', 'CA', '94102', 'Zone_A',
 ST_Transform(ST_GeomFromText('POINT(-122.4080 37.7860)', 4326), 6561)),

-- Site with group1='Zone_B' but physically located in Zone_A
(2004, 1, 'Home-2004-INVALID', '888 Wrong Zone', 'San Francisco', 'CA', '94103', 'Zone_B',
 ST_Transform(ST_GeomFromText('POINT(-122.4200 37.7750)', 4326), 6561)),

-- Site with group1='Zone_C' but physically located outside all zones
(3003, 1, 'Home-3003-INVALID', '999 Outside All Zones', 'San Francisco', 'CA', '94110', 'Zone_C',
 ST_Transform(ST_GeomFromText('POINT(-122.3800 37.7500)', 4326), 6561)),

-- Site with group1='Zone_A' but physically outside all zones
(1014, 1, 'Home-1014-INVALID', '1111 Far Away', 'San Francisco', 'CA', '94110', 'Zone_A',
 ST_Transform(ST_GeomFromText('POINT(-122.3700 37.7400)', 4326), 6561)),

-- Site with NULL group1 (unassigned) but inside Zone_A
(1015, 1, 'Home-1015-UNASSIGNED', '1212 Unassigned St', 'San Francisco', 'CA', '94103', NULL,
 ST_Transform(ST_GeomFromText('POINT(-122.4250 37.7800)', 4326), 6561)),

-- Site with group1='Zone_D' (correctly in Zone_D, but Zone_D has no network)
(4001, 1, 'Home-4001', '1313 Empty Zone', 'San Francisco', 'CA', '94110', 'Zone_D',
 ST_Transform(ST_GeomFromText('POINT(-122.3875 37.7625)', 4326), 6561));

-- ============================================
-- ADD SOME POLES AND ACCESS POINTS OUTSIDE THEIR ZONES FOR TESTING
-- ============================================

-- Pole with group1='Zone_A' but physically in Zone_B (INVALID)
INSERT INTO eli_test.poles (id, mapprojectid, name, owner, poleheight, attachmentheight, group1, geom)
VALUES
(201, 1, 'Pole-201-INVALID', 'Test Utility', '40', '35', 'Zone_A',
 ST_Transform(ST_GeomFromText('POINT(-122.4085 37.7850)', 4326), 6561)),

-- Pole with group1='Zone_B' but physically in Zone_C (INVALID)
(202, 1, 'Pole-202-INVALID', 'Test Utility', '40', '35', 'Zone_B',
 ST_Transform(ST_GeomFromText('POINT(-122.3985 37.7960)', 4326), 6561)),

-- Pole with group1='Zone_C' but outside all zones (INVALID)
(203, 1, 'Pole-203-INVALID', 'Test Utility', '40', '35', 'Zone_C',
 ST_Transform(ST_GeomFromText('POINT(-122.3600 37.7300)', 4326), 6561));

-- Access point with group1='Zone_A' but physically in Zone_B (INVALID)
INSERT INTO eli_test.access_points (id, name, mapprojectid, description, manufacturer, size, typeid, statusid, group1, geom)
VALUES
(301, 'Handhole-301-INVALID', 1, 'Misplaced handhole', 'CommScope', '24x36', 1, 1, 'Zone_A',
 ST_Transform(ST_GeomFromText('POINT(-122.4075 37.7870)', 4326), 6561)),

-- Access point with group1='Zone_B' but outside all zones (INVALID)
(302, 'Handhole-302-INVALID', 1, 'Outside zone handhole', 'CommScope', '24x36', 1, 1, 'Zone_B',
 ST_Transform(ST_GeomFromText('POINT(-122.3500 37.7200)', 4326), 6561));

-- ============================================
-- TEST SEGMENTS THAT CROSS ZONE BOUNDARIES
-- ============================================

-- These segments test the scenario where a segment is PARTIALLY in its assigned zone
-- According to requirements, these should NOT be flagged as invalid

INSERT INTO eli_test.segment2 (type, length, cost, fdh_id, "Group 1", geom)
VALUES
-- VALID: Segment tagged Zone_A that starts in Zone_A and crosses into Zone_B
-- Starts at -122.4110 (in Zone_A) and ends at -122.4070 (in Zone_B)
('Aerial', 450.0, 4500.00, 1, 'Zone_A',
 ST_Transform(ST_GeomFromText('LINESTRING(-122.4110 37.7850, -122.4070 37.7860)', 4326), 6561)),

-- VALID: Another segment tagged Zone_A that crosses into Zone_B
('Underground', 380.0, 7600.00, 1, 'Zone_A',
 ST_Transform(ST_GeomFromText('LINESTRING(-122.4120 37.7855, -122.4065 37.7865)', 4326), 6561)),

-- INVALID: Segment tagged Zone_A but COMPLETELY in Zone_B (no part in Zone_A)
('Aerial', 145.0, 1450.00, 1, 'Zone_A',
 ST_Transform(ST_GeomFromText('LINESTRING(-122.4095 37.7855, -122.4085 37.7865)', 4326), 6561)),

-- INVALID: Segment with NULL/blank "Group 1" (should be flagged)
('Aerial', 150.0, 1500.00, 1, NULL,
 ST_Transform(ST_GeomFromText('LINESTRING(-122.4200 37.7780, -122.4190 37.7790)', 4326), 6561)),

-- INVALID: Another segment with NULL/blank "Group 1"
('Underground', 140.0, 2800.00, 1, NULL,
 ST_Transform(ST_GeomFromText('LINESTRING(-122.4180 37.7785, -122.4170 37.7795)', 4326), 6561)),

-- VALID: Segment that crosses from Zone_B into area outside defined zones, but partially in Zone_B
('Aerial', 500.0, 5000.00, 2, 'Zone_B',
 ST_Transform(ST_GeomFromText('LINESTRING(-122.4080 37.7860, -122.4050 37.7870)', 4326), 6561)),

-- INVALID: Segment tagged Zone_C but completely outside all zones
('Underground', 200.0, 4000.00, 3, 'Zone_C',
 ST_Transform(ST_GeomFromText('LINESTRING(-122.3600 37.7300, -122.3580 37.7310)', 4326), 6561));

-- ============================================
-- VERIFICATION QUERIES
-- ============================================

-- Verify zone polygons were inserted
SELECT COUNT(*) as total_zones FROM eli_test.info;
SELECT id, name, group_1, ST_AsText(ST_Transform(geom, 4326)) as polygon_wkt FROM eli_test.info ORDER BY id;

-- Verify sites were inserted
SELECT COUNT(*) as total_sites FROM eli_test.sites;
SELECT id, name, group1, ST_AsText(ST_Transform(geom, 4326)) as location FROM eli_test.sites ORDER BY id;

-- Check which sites are VALID (inside their assigned zone)
SELECT
    s.id,
    s.name,
    s.group1 as assigned_zone,
    i.group_1 as actual_zone,
    CASE
        WHEN s.group1 = i.group_1 THEN 'VALID'
        ELSE 'INVALID'
    END as status
FROM eli_test.sites s
LEFT JOIN eli_test.info i ON ST_Within(s.geom, i.geom)
ORDER BY s.id;

-- Check which poles are VALID (inside their assigned zone)
SELECT
    p.id,
    p.name,
    p.group1 as assigned_zone,
    i.group_1 as actual_zone,
    CASE
        WHEN p.group1 = i.group_1 THEN 'VALID'
        ELSE 'INVALID'
    END as status
FROM eli_test.poles p
LEFT JOIN eli_test.info i ON ST_Within(p.geom, i.geom)
WHERE p.id >= 201 -- Only check the new test poles we added
ORDER BY p.id;

-- Check which access points are VALID (inside their assigned zone)
SELECT
    ap.id,
    ap.name,
    ap.group1 as assigned_zone,
    i.group_1 as actual_zone,
    CASE
        WHEN ap.group1 = i.group_1 THEN 'VALID'
        ELSE 'INVALID'
    END as status
FROM eli_test.access_points ap
LEFT JOIN eli_test.info i ON ST_Within(ap.geom, i.geom)
WHERE ap.id >= 301 -- Only check the new test access points we added
ORDER BY ap.id;

-- Count sites by zone assignment
SELECT group1, COUNT(*) as site_count
FROM eli_test.sites
GROUP BY group1
ORDER BY group1;

-- ============================================
-- TEST SCENARIO SUMMARY
-- ============================================

/*
ZONE POLYGONS:
- Zone_A: Large polygon covering most test segments (37.77-37.82, -122.435--122.41)
- Zone_B: Smaller polygon for Zone_B segments (37.784-37.789, -122.41--122.406)
- Zone_C: Small polygon for Zone_C segments (37.794-37.798, -122.40--122.397)
- Zone_D: Empty zone with no network elements (for edge case testing)

SITES (HOME POINTS):
- Valid sites in Zone_A: 12 sites correctly within Zone_A polygon
- Valid sites in Zone_B: 3 sites correctly within Zone_B polygon
- Valid sites in Zone_C: 2 sites correctly within Zone_C polygon
- Valid sites in Zone_D: 1 site correctly within Zone_D polygon
- Invalid sites (wrong zone): 4 sites with group1 not matching actual zone location
- Unassigned site: 1 site with NULL group1

TOTAL SITES: 23 sites (18 valid, 4 invalid, 1 unassigned)

INVALID NETWORK ELEMENTS FOR ZONE QC:
- Poles: 3 poles with group1 not matching their zone location
- Access Points: 2 access points with group1 not matching their zone location

SEGMENTS THAT CROSS ZONE BOUNDARIES:
- Valid (partially in assigned zone): 3 segments that cross boundaries but have some part in their assigned zone
- Invalid (completely outside assigned zone): 2 segments tagged for a zone but completely in a different zone
- Invalid (NULL/blank zone): 2 segments with NULL "Group 1" attribute

QC TEST SCENARIOS THIS ENABLES:
1. Verify sites are within their assigned zones
2. Verify poles are within their assigned zones
3. Verify access points are within their assigned zones
4. Verify segments are within their assigned zones (segments have "Group 1" column)
5. DO NOT flag segments that are PARTIALLY in their assigned zone (crossing boundaries is OK)
6. DO flag segments that are COMPLETELY outside their assigned zone
7. DO flag any network element with NULL/blank zone attribute
8. Detect network elements outside all zones
9. Handle zones with no network elements (Zone_D)
10. Detect mismatches between assigned zone and physical location
*/
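The segment-zone rules in the summary above (don't flag boundary-crossing segments, do flag segments completely outside their assigned zone or with a NULL zone) can be sketched as one query. This is an illustrative sketch against the tables above, not part of the committed file; the `s.id` column name is an assumption, since segment2's key is never named in the INSERTs.

```sql
-- Hedged sketch: a segment is a violation if its "Group 1" is NULL, or if
-- NO part of it intersects its assigned zone polygon. ST_Intersects (rather
-- than ST_Within) deliberately accepts segments that are only PARTIALLY
-- inside, matching rule 5 of the summary.
SELECT s.id,
       s."Group 1" AS assigned_zone,
       CASE
           WHEN s."Group 1" IS NULL THEN 'NULL zone attribute'
           ELSE 'completely outside assigned zone'
       END AS issue
FROM eli_test.segment2 s
LEFT JOIN eli_test.info i
       ON i.group_1 = s."Group 1"
      AND ST_Intersects(s.geom, i.geom)
WHERE s."Group 1" IS NULL
   OR i.id IS NULL;
```

The join condition pairs each segment with its own zone polygon only when they overlap, so a NULL `i.id` after the LEFT JOIN means the segment never touches its assigned zone.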
3
oldqc/test_data_zones_sites.sql:Zone.Identifier
Normal file
@@ -0,0 +1,3 @@
[ZoneTransfer]
ZoneId=3
ReferrerUrl=C:\Users\AlexanderHall\Downloads\Auto_LLD-QC-main.zip
3
oldqc/tmp/build-errors.log:Zone.Identifier
Normal file
@@ -0,0 +1,3 @@
[ZoneTransfer]
ZoneId=3
ReferrerUrl=C:\Users\AlexanderHall\Downloads\Auto_LLD-QC-main.zip
BIN
revisions1/logo.png
Normal file
Binary file not shown.
After Width: | Height: | Size: 2.2 MiB
BIN
revisions1/logo.png:Zone.Identifier
Normal file
Binary file not shown.
BIN
revisions1/newfrontendlayout.png
Normal file
Binary file not shown.
After Width: | Height: | Size: 874 KiB
10
revisions1/revisions.txt
Normal file
@@ -0,0 +1,10 @@
Requests:

FRONT END:
-Add "logo.png" to webpage. Use "newfrontendlayout.png" to see how I wish it to be formatted.

BACK END:
-access_points, segments, parcels, cabinet_boundaries, network elements, splicing shapefiles: script looking for "Group_01", but shapefiles will have a field called "Group 1"
-cables: I don't get this error. Why is this getting flagged? Here is one of them: 144F/EUR_Z07_DC_001. Make the script accept this. It needs to start with XXXF, three numbers followed by a capital "F".
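The requested cable-name rule above (accept names like 144F/EUR_Z07_DC_001: three digits followed by a capital "F") reduces to a small regex check. A hedged sketch in PostGIS/PostgreSQL SQL, where the `cables` table and `name` column are illustrative assumptions rather than the backend's actual implementation:

```sql
-- Hedged sketch of the cable-name QC: list names that do NOT start with
-- three digits followed by a capital 'F'. Names such as 144F/EUR_Z07_DC_001
-- pass; anything else is flagged.
SELECT name
FROM cables
WHERE name !~ '^[0-9]{3}F';
```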
12
revisions2/revisions2.txt
Normal file
@@ -0,0 +1,12 @@
Backend:
-add this as next QC step after existing ones, before returning to front end:
1. Ensure all sites features are within the correct cabinet boundary. Make sure the 2-digit number in the cabinet_boundaries feature "Name" field matches the 2-digit number in the "Group 1" field for all sites.
2. Ensure all access_points features are within the correct cabinet boundary. Make sure the 2-digit number in the cabinet_boundaries feature "Name" field matches the 2-digit number in the "Group 1" field for all access_points.
3. Ensure all permits features are within the correct cabinet boundary. Make sure the 2-digit number in the cabinet_boundaries feature "Name" field matches the 2-digit number in the "Group 1" field for all permits features.
4. Ensure all splicing features are within the correct cabinet boundary. Make sure the 2-digit number in the cabinet_boundaries feature "Name" field matches the 2-digit number in the "Group 1" field for all splicing features.
5. Ensure all network_elements features are within the correct cabinet boundary. Make sure the 2-digit number in the cabinet_boundaries feature "Name" field matches the 2-digit number in the "Group 1" field for all network_elements features.
6. Ensure all poles features are within the correct cabinet boundary. Make sure the 2-digit number in the cabinet_boundaries feature "Name" field matches the 2-digit number in the "Group 1" field for all poles features.
7. Ensure all segments features are within the correct cabinet boundary. Make sure the 2-digit number in the cabinet_boundaries feature "Name" field matches the 2-digit number in the "Group 1" field for all access_points. EXCEPTION: If a segment crosses between cabinet_boundaries polygons, don't evaluate it.
8. Ensure NO features are outside of the entirety of the cabinet_boundaries polygons layer

Like with previous QC efforts, return list of specific issues in the .txt file if it does not pass QC
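Steps 1 through 7 above all share one shape: find the cabinet boundary each feature falls inside, extract the 2-digit code from the boundary's "Name" field, and compare it against the 2-digit code in the feature's "Group 1" field. A hedged sketch of step 1 (sites) in PostGIS SQL; the `sites` and `cabinet_boundaries` table names and `geom` columns are assumptions taken from the shapefile names in the request, not the actual backend code:

```sql
-- Hedged sketch: flag sites whose containing cabinet boundary's 2-digit
-- "Name" code differs from the 2-digit code in the site's "Group 1".
-- substring(... from '\d{2}') pulls the first 2-digit run from each field.
SELECT s.id, s."Group 1" AS site_group, cb."Name" AS boundary_name
FROM sites s
JOIN cabinet_boundaries cb
  ON ST_Within(s.geom, cb.geom)
WHERE substring(cb."Name" from '\d{2}') IS DISTINCT FROM
      substring(s."Group 1" from '\d{2}');
```

`IS DISTINCT FROM` treats a missing 2-digit code (NULL) as a mismatch, so blank "Group 1" values are also flagged; step 8 (no features outside all boundaries) would instead use an anti-join on `ST_Within`.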
BIN
revisions3/celebrate.png
Normal file
Binary file not shown.
After Width: | Height: | Size: 2.0 MiB
BIN
revisions3/celebrate.png:Zone.Identifier
Normal file
Binary file not shown.
BIN
revisions3/layout.png
Normal file
Binary file not shown.
After Width: | Height: | Size: 572 KiB
4
revisions3/revision3.txt
Normal file
@@ -0,0 +1,4 @@
PRD:

Front End Step 1:
-IF it passes QC display "celebrate.png" (in this folder) on webpage, in front, large.
1
samplefiles/access_points.cpg
Normal file
@@ -0,0 +1 @@
UTF-8
BIN
samplefiles/access_points.dbf
Normal file
Binary file not shown.
1
samplefiles/access_points.prj
Normal file
@@ -0,0 +1 @@
GEOGCS["GCS_WGS_1984",DATUM["D_WGS_1984",SPHEROID["WGS_1984",6378137.0,298.257223563]],PRIMEM["Greenwich",0.0],UNIT["Degree",0.0174532925199433]]
27
samplefiles/access_points.qmd
Normal file
@@ -0,0 +1,27 @@
<!DOCTYPE qgis PUBLIC 'http://mrcc.com/qgis.dtd' 'SYSTEM'>
<qgis version="3.36.3-Maidenhead">
  <identifier></identifier>
  <parentidentifier></parentidentifier>
  <language></language>
  <type></type>
  <title></title>
  <abstract></abstract>
  <links/>
  <dates/>
  <fees></fees>
  <encoding></encoding>
  <crs>
    <spatialrefsys nativeFormat="Wkt">
      <wkt></wkt>
      <proj4></proj4>
      <srsid>0</srsid>
      <srid>0</srid>
      <authid></authid>
      <description></description>
      <projectionacronym></projectionacronym>
      <ellipsoidacronym></ellipsoidacronym>
      <geographicflag>false</geographicflag>
    </spatialrefsys>
  </crs>
  <extent/>
</qgis>
BIN
samplefiles/access_points.shp
Normal file
Binary file not shown.
BIN
samplefiles/access_points.shx
Normal file
Binary file not shown.
1
samplefiles/cabinet_boundaries.cpg
Normal file
@@ -0,0 +1 @@
UTF-8
BIN
samplefiles/cabinet_boundaries.dbf
Normal file
Binary file not shown.
1
samplefiles/cabinet_boundaries.prj
Normal file
@@ -0,0 +1 @@
GEOGCS["GCS_WGS_1984",DATUM["D_WGS_1984",SPHEROID["WGS_1984",6378137.0,298.257223563]],PRIMEM["Greenwich",0.0],UNIT["Degree",0.0174532925199433]]
27
samplefiles/cabinet_boundaries.qmd
Normal file
@@ -0,0 +1,27 @@
<!DOCTYPE qgis PUBLIC 'http://mrcc.com/qgis.dtd' 'SYSTEM'>
<qgis version="3.36.3-Maidenhead">
  <identifier></identifier>
  <parentidentifier></parentidentifier>
  <language></language>
  <type></type>
  <title></title>
  <abstract></abstract>
  <links/>
  <dates/>
  <fees></fees>
  <encoding></encoding>
  <crs>
    <spatialrefsys nativeFormat="Wkt">
      <wkt></wkt>
      <proj4></proj4>
      <srsid>0</srsid>
      <srid>0</srid>
      <authid></authid>
      <description></description>
      <projectionacronym></projectionacronym>
      <ellipsoidacronym></ellipsoidacronym>
      <geographicflag>false</geographicflag>
    </spatialrefsys>
  </crs>
  <extent/>
</qgis>
BIN
samplefiles/cabinet_boundaries.shp
Normal file
Binary file not shown.
BIN
samplefiles/cabinet_boundaries.shx
Normal file
Binary file not shown.
1
samplefiles/cables.cpg
Normal file
@@ -0,0 +1 @@
UTF-8
BIN
samplefiles/cables.dbf
Normal file
Binary file not shown.
1
samplefiles/cables.prj
Normal file
@@ -0,0 +1 @@
GEOGCS["GCS_WGS_1984",DATUM["D_WGS_1984",SPHEROID["WGS_1984",6378137.0,298.257223563]],PRIMEM["Greenwich",0.0],UNIT["Degree",0.0174532925199433]]
27
samplefiles/cables.qmd
Normal file
@@ -0,0 +1,27 @@
<!DOCTYPE qgis PUBLIC 'http://mrcc.com/qgis.dtd' 'SYSTEM'>
<qgis version="3.36.3-Maidenhead">
  <identifier></identifier>
  <parentidentifier></parentidentifier>
  <language></language>
  <type></type>
  <title></title>
  <abstract></abstract>
  <links/>
  <dates/>
  <fees></fees>
  <encoding></encoding>
  <crs>
    <spatialrefsys nativeFormat="Wkt">
      <wkt></wkt>
      <proj4></proj4>
      <srsid>0</srsid>
      <srid>0</srid>
      <authid></authid>
      <description></description>
      <projectionacronym></projectionacronym>
      <ellipsoidacronym></ellipsoidacronym>
      <geographicflag>false</geographicflag>
    </spatialrefsys>
  </crs>
  <extent/>
</qgis>
BIN
samplefiles/cables.shp
Normal file
Binary file not shown.
Some files were not shown because too many files have changed in this diff.