@sfc-gh-vsekar
Created September 27, 2025 17:49
A series of Snowflake notebooks demonstrating use of the ArcGIS Python SDK natively in Snowflake
{
"metadata": {
"language_info": {
"name": "python"
},
"lastEditStatus": {
"notebookId": "fztes3b6kddjxcimln",
"lastEditTime": 1758994924641
}
},
"nbformat_minor": 5,
"nbformat": 4,
"cells": [
{
"cell_type": "markdown",
"id": "3fc5e057-92aa-4e27-856a-9d4572e460c5",
"metadata": {
"collapsed": false,
"name": "cell1"
},
"source": "# ArcGIS Closest Facility Analysis with Snowflake Stored Procedures\n\n## ๐ŸŽฏ Executive Summary\n\nThis notebook demonstrates a revolutionary approach to field service optimization by integrating **ArcGIS Network Analysis** capabilities directly within **Snowflake** using stored procedures. This implementation eliminates the need for separate processing servers and prevents data exfiltration, while enabling sophisticated spatial analysis and automated web map publishing.\n\nArcGIS Find Closest Facilities Guide: https://developers.arcgis.com/python/latest/guide/part4-find-closest-facilities/\n\n## ๐Ÿ—๏ธ Architecture Overview\n\n### Traditional vs. Snowpark Approach\n\n**Traditional Architecture Challenges:**\n- Requires separate application servers for spatial processing\n- Data must be extracted from Snowflake for analysis\n- Complex data synchronization between systems\n- Security concerns with data leaving the enterprise boundary\n- Additional infrastructure costs and maintenance overhead\n\n**Snowpark-Powered Solution:**\n- **No intermediary external data movement**: Complete spatial analysis originating from Snowflake to ArcGIS\n- **Serverless Architecture**: No need for dedicated spatial processing infrastructure\n- **Native Integration**: Direct access to ArcGIS services from Snowflake stored procedures\n- **Automated Publishing**: Results automatically published to ArcGIS Online/Portal\n\n### Key Innovation Benefits\n\n1. **๐Ÿ’ฐ Cost Efficiency**: Eliminates need for separate spatial processing servers\n3. **โšก Performance**: Reduced latency by eliminating data transfer overhead\n4. **๐Ÿ”„ Simplified Architecture**: Single platform for data storage and spatial analysis\n5. 
**๐Ÿ“Š Real-time Publishing**: Automatic creation of interactive web maps and feature services\n\n## ๐ŸŽฏ Use Cases and Applications\n\n### Primary Use Cases\n- **Emergency Response Dispatch**: Optimal assignment of first responders to incidents\n- **Field Service Management**: Technician routing and work order optimization\n- **Delivery Route Planning**: Last-mile delivery optimization\n- **Facility Location Analysis**: Strategic placement of service centers\n- **Resource Allocation**: Dynamic assignment of mobile resources\n\n### Industry Applications\n- **Healthcare**: Ambulance dispatch and mobile health services\n- **Utilities**: Field crew dispatch for maintenance and repairs\n- **Telecommunications**: Network maintenance and installation crews\n- **Public Safety**: Police, fire, and emergency medical services\n- **Logistics**: Delivery and pickup route optimization\n\n## ๐Ÿ”ง Technical Architecture\n\n### Core Components\n\n1. **Snowflake Snowpark**: Python runtime environment within Snowflake\n2. **ArcGIS API for Python**: Spatial analysis and web services integration\n3. **External Access Integrations**: Secure connectivity to ArcGIS services\n4. **Stored Procedures**: Encapsulated spatial processing logic\n5. 
**Feature Publishing**: Automated web map and service creation\n\n### Data Flow Architecture\n\n```\nโ”Œโ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ” โ”Œโ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ” โ”Œโ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”\nโ”‚ Source Data โ”‚โ”€โ”€โ”€โ–ถโ”‚ Snowflake โ”‚โ”€โ”€โ”€โ–ถโ”‚ ArcGIS Online โ”‚\nโ”‚ (Technicians, โ”‚ โ”‚ Stored Proc โ”‚ โ”‚ Web Maps & โ”‚\nโ”‚ Incidents) โ”‚ โ”‚ Spatial Analysisโ”‚ โ”‚ Feature Layers โ”‚\nโ””โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”˜ โ””โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”˜ โ””โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”˜\n โ”‚\n โ–ผ\n โ”Œโ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”\n โ”‚ Result Tables โ”‚\n โ”‚ (Routes, Assign-โ”‚\n โ”‚ ments, KPIs) โ”‚\n โ””โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”˜\n```\n\n## ๐Ÿ“‹ Prerequisites and Requirements\n\n### Snowflake Requirements\n- Snowflake account with Snowpark enabled\n- External access integration capabilities\n- Appropriate compute warehouse for spatial processing\n\n### ArcGIS Requirements\n- ArcGIS Online or Portal for ArcGIS account\n- Network Analysis service access\n- Valid user credentials with publishing permissions\n\n### Python Dependencies\n- `arcgis` - ArcGIS API for Python\n- `geopandas` - Geospatial data manipulation\n- `shapely` - Geometric operations\n- `pandas` - Data manipulation and analysis\n\n---"
},
{
"cell_type": "markdown",
"id": "3010848a-81ee-421f-9a2d-07eedd495fe5",
"metadata": {
"name": "cell2",
"collapsed": false
},
"source": "## 1. ๐Ÿ”ง Environment Setup and Configuration\n\nThis section initializes the Snowflake Snowpark session and configures the connection parameters for ArcGIS services. The configuration includes:\n\n### Key Configuration Elements\n\n- **Development Role**: Defines the Snowflake role for development operations\n- **ArcGIS Portal URL**: Target ArcGIS Online or Portal for ArcGIS instance\n- **Authentication Credentials**: User credentials for ArcGIS service access\n- **Session Context**: Active Snowpark session for data operations\n\n**Note:**\n- add package :\n - python-dotenv\n- create a '.env' and configure the following keys and value:\n\n```\n# session role\ndev_role = 'public'\n#\n# ArcGIS Portal configuration\narcgis_portal_url = 'ABCD.arcgis.com'\n#\n# ArcGIS authentication credentials\narcgis_portal_user_name = '---'\narcgis_portal_password = '---'"
},
{
"cell_type": "code",
"execution_count": null,
"id": "486b37a7-57a2-422a-8452-b6d9cb76618b",
"metadata": {
"codeCollapsed": false,
"collapsed": false,
"language": "python",
"name": "initialization",
"vscode": {
"languageId": "plaintext"
}
},
"outputs": [],
"source": "# ==========================================\n# LIBRARY IMPORTS AND CONFIGURATION\n# ==========================================\n\n# Core data processing and Snowflake libraries\nimport streamlit as st\nimport pandas as pd\nimport os\nimport snowflake.snowpark.types as T\nimport snowflake.snowpark.functions as F\nfrom snowflake.snowpark import Session\n\nfrom pathlib import Path\n# Check if the file exists\nif Path('.env').is_file():\n from dotenv import load_dotenv\n load_dotenv() # take environment variables\n\n# ==========================================\n# ENVIRONMENT CONFIGURATION\n# ==========================================\n\n# session role\ndev_role = os.getenv('dev_role','public')\n\n# ArcGIS Portal configuration\narcgis_portal_url = os.getenv('arcgis_portal_url','abc.maps.arcgis.com')\n\n# ArcGIS authentication credentials\narcgis_portal_user_name = os.getenv('arcgis_portal_user_name','john.doe')\narcgis_portal_password = os.getenv('arcgis_portal_password','blah-blah')\n\nprint(f\"๐Ÿ”ง Configuration loaded:\")\nprint(f\" ๐ŸŒ ArcGIS Portal: {arcgis_portal_url}\")\n\n# ==========================================\n# SESSION INITIALIZATION\n# ==========================================\n\n# Initialize Snowpark session\n# This gets the active session from the notebook environment\nsp_session = get_active_session()\nsp_session.use_role(dev_role)\n\n# Display session context for verification\n_spdf = sp_session.sql(\"\"\"\n SELECT \n current_role() as CURRENT_ROLE,\n current_warehouse() as CURRENT_WAREHOUSE,\n current_database() as CURRENT_DATABASE,\n current_schema() as CURRENT_SCHEMA,\n current_user() as CURRENT_USER\n\"\"\")\n_spdf"
},
{
"cell_type": "markdown",
"id": "419021c3-bd1f-4283-a6a1-f20b9e90478a",
"metadata": {
"name": "cell5",
"collapsed": false
},
"source": "## 2. ๐Ÿ—„๏ธ Secure Storage Infrastructure\n\n### Stage Creation for Spatial Processing Artifacts\n\nThis section creates a secure Snowflake stage to store Python scripts, libraries, and other components required for spatial processing. The stage serves as a secure repository for:\n\n- **Python Scripts**: Custom spatial processing logic\n- **Library Dependencies**: Required Python packages and modules \n- **Configuration Files**: Processing parameters and settings\n- **Temporary Artifacts**: Intermediate processing results"
},
{
"cell_type": "code",
"execution_count": null,
"id": "92bbdb7c-d978-4a54-8950-0f2c726ec4f6",
"metadata": {
"codeCollapsed": false,
"collapsed": false,
"language": "sql",
"name": "define_stage"
},
"outputs": [],
"source": [
"-- NOTE: Please switch this to SQL cell.\n",
"\n",
"-- Switch to the development role for stage creation\n",
"USE ROLE {{dev_role}};\n",
"\n",
"-- Create secure library stage for artifacts\n",
"-- This stage will store Python scripts, libraries, and other components\n",
"CREATE OR REPLACE STAGE arcgis_lib_stg\n",
" DIRECTORY = (ENABLE = TRUE)\n",
" ENCRYPTION = (TYPE = 'SNOWFLAKE_SSE')\n",
" COMMENT = 'Secure stage for ArcGIS spatial processing artifacts, Python scripts, and dependencies';\n",
"\n",
"-- Verify stage creation\n",
"SHOW STAGES LIKE 'arcgis_lib_stg';"
]
},
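{
"cell_type": "markdown",
"id": "c1a2b3d4-0001-4e5f-8a9b-0a1b2c3d4e5f",
"metadata": {
"collapsed": false,
"name": "cell_stage_upload_note"
},
"source": "### Optional: Uploading Artifacts to the Stage\n\nA minimal sketch (added for illustration; `my_spatial_utils.py` is a hypothetical file name) of how Python artifacts could be uploaded to the `arcgis_lib_stg` stage using the Snowpark `session.file.put` API."
},
{
"cell_type": "code",
"execution_count": null,
"id": "c1a2b3d4-0002-4e5f-8a9b-0a1b2c3d4e5f",
"metadata": {
"language": "python",
"name": "upload_artifacts_sketch"
},
"outputs": [],
"source": "# Sketch: upload a local artifact to the library stage created above.\n# 'my_spatial_utils.py' is a hypothetical file name.\nsp_session.file.put(\n    'my_spatial_utils.py',   # local file to upload\n    '@arcgis_lib_stg',       # target stage\n    auto_compress=False,     # keep .py files uncompressed so they can be imported\n    overwrite=True\n)\n\n# Verify the upload\nsp_session.sql('LS @arcgis_lib_stg').show()"
},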
{
"cell_type": "markdown",
"id": "fedec4da-890a-48bd-8abe-70963314eb6c",
"metadata": {
"collapsed": false,
"name": "cell4"
},
"source": "## 3. ๐ŸŒ External Access Integrations\n\nConfigure secure external access integrations to allow Snowflake to communicate with external services. This section sets up network rules, secrets, integrations, and utility functions for PyPI and ArcGIS services.\n\n### Integration Overview\n\n**ArcGIS Integration**: Enables secure communication with ArcGIS services\n- Network access to ArcGIS Online/Portal endpoints\n- Secure credential storage using Snowflake secrets\n- Support for routing and spatial analysis services\n\n### Security Architecture\n\n1. **Network Rules**: Define allowed external endpoints and protocols\n2. **Secrets**: Securely store authentication credentials\n3. **External Access Integrations**: Combine network rules and secrets for controlled access\n4. **Role-based Permissions**: Grant access only to authorized roles\n\n### 3.1 ๐Ÿ—บ๏ธ ArcGIS External Access Integration\n\nConfigure secure external access to ArcGIS services, including encrypted credential storage and network rules. This enables the spatial processing workloads to access ArcGIS routing and analysis services."
},
{
"cell_type": "code",
"execution_count": null,
"id": "681d9861-5491-45e9-bfbb-7cebd98eac1a",
"metadata": {
"codeCollapsed": false,
"collapsed": false,
"language": "sql",
"name": "create_eai_arcgis"
},
"outputs": [],
"source": [
"-- NOTE: Please switch this to SQL cell.\n",
"\n",
"USE ROLE {{dev_role}};\n",
"\n",
"CREATE OR REPLACE SECRET arcgis_api_key\n",
" TYPE = PASSWORD\n",
" USERNAME = '{{arcgis_portal_user_name}}'\n",
" PASSWORD = '{{arcgis_portal_password}}'\n",
" COMMENT = 'API key used for connecting to ArcGIS Portal';\n",
"\n",
"GRANT USAGE ON SECRET arcgis_api_key\n",
" TO ROLE {{dev_role}};\n",
"\n",
"CREATE OR REPLACE NETWORK RULE nw_esri_api\n",
" MODE = EGRESS\n",
" TYPE = HOST_PORT\n",
" VALUE_LIST = ('*.arcgis.com', '{{arcgis_portal_url}}')\n",
" COMMENT = 'Network rule for ArcGIS API access';\n",
"\n",
"USE ROLE ACCOUNTADMIN;\n",
"\n",
"CREATE OR REPLACE EXTERNAL ACCESS INTEGRATION eai_esri_api\n",
" ALLOWED_NETWORK_RULES = (nw_esri_api)\n",
" ALLOWED_AUTHENTICATION_SECRETS = (arcgis_api_key)\n",
" ENABLED = TRUE\n",
" COMMENT = 'External access integration for ArcGIS services';\n",
"\n",
"GRANT USAGE ON INTEGRATION eai_esri_api\n",
" TO ROLE {{dev_role}};\n",
"\n",
"USE ROLE {{dev_role}};\n"
]
},
{
"cell_type": "markdown",
"id": "616024a5-9ed3-432a-a28a-e2b10cf619aa",
"metadata": {
"name": "cell8",
"collapsed": false
},
"source": "## 4. ๐Ÿ“Š Sample Data Loading and Preparation\n\nThis section loads sample field service data into Snowflake tables. The demo uses two primary datasets:\n\n### 4.1 ๐Ÿ‘ท Field Technician Data\n\nContains location and availability information for field service technicians:\n- **Technician ID**: Unique identifier for each technician\n- **Location Coordinates**: Latitude/longitude of current technician position\n- **Address Information**: Human-readable address for context\n- **Availability Status**: Current work status and capacity\n"
},
{
"cell_type": "code",
"id": "b8350ee6-3f61-4e80-a110-4a1d6539d060",
"metadata": {
"language": "sql",
"name": "load_data_field_technician"
},
"outputs": [],
"source": "CREATE OR REPLACE TRANSIENT TABLE field_technician (\n FIELD_TECHNICIAN_ID VARCHAR(50),\n LONGITUDE FLOAT,\n LATITUDE FLOAT,\n FIELD_TECHNICIAN_ADDRESS VARCHAR(200)\n);\n\nINSERT INTO field_technician VALUES\n('a70b', -97.6911692, 30.3391586, '1030 Norwood Park Austin TX 78753'),\n('8f2d', -97.7092389, 30.3073893, '1200 Barbara Jordan Austin TX 78723'),\n('b14c', -97.7993718, 30.4794069, '11301 Lakeline Austin TX 78613'),\n('8e47', -97.7409877, 30.3647362, '8000 Shoal Creek Austin TX 78757'),\n('848f', -97.8239203, 30.232782, '5017 Highway 290 Austin TX 78749'),\n('9772', -97.742009, 30.3916, '9700 Capital Of Texas Austin TX 78759'),\n('94f8', -97.7335442, 30.3554777, '2525 Anderson Austin TX 78757'),\n('b280', -97.7965281, 30.4760066, '10900 Lakeline Mall Austin TX 78613'),\n('a08e', -97.8239203, 30.232782, '5017 Highway 290 Austin TX 78749'),\n('a74e', -97.7919894, 30.1659125, '9300 Interstate 35 Austin TX 78748')\n;",
"execution_count": null
},
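{
"cell_type": "markdown",
"id": "c1a2b3d4-0003-4e5f-8a9b-0a1b2c3d4e5f",
"metadata": {
"collapsed": false,
"name": "cell_sanity_check_note"
},
"source": "### Optional: Straight-Line Sanity Check\n\nA quick, pure-Python check (hypothetical helper `haversine_km`; not part of the ArcGIS workflow): straight-line distances between the sample technician and incident coordinates provide a lower bound for the road-network distances the solver later returns in `Total_Kilometers`."
},
{
"cell_type": "code",
"execution_count": null,
"id": "c1a2b3d4-0004-4e5f-8a9b-0a1b2c3d4e5f",
"metadata": {
"language": "python",
"name": "sanity_check_haversine"
},
"outputs": [],
"source": "# OPTIONAL sanity check (not part of the ArcGIS workflow): straight-line\n# (haversine) distances between sample points. Network driving distances\n# returned by find_closest_facilities() should always be >= these values.\nimport math\n\ndef haversine_km(lon1, lat1, lon2, lat2):\n    \"\"\"Great-circle distance in kilometres between two lon/lat points.\"\"\"\n    r = 6371.0  # mean Earth radius in km\n    p1, p2 = math.radians(lat1), math.radians(lat2)\n    a = (math.sin((p2 - p1) / 2) ** 2\n         + math.cos(p1) * math.cos(p2) * math.sin(math.radians(lon2 - lon1) / 2) ** 2)\n    return 2 * r * math.asin(math.sqrt(a))\n\n# Technician 'a70b' to incident 'b414' (coordinates from the tables above)\nd = haversine_km(-97.6911692, 30.3391586, -97.6881817, 30.3768186)\nprint(f'Straight-line distance: {d:.2f} km')"
},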
{
"cell_type": "markdown",
"id": "ffe4d064-689e-44c7-b8b1-fe39f2643279",
"metadata": {
"name": "cell7",
"collapsed": false
},
"source": "### 4.2 ๐Ÿšจ Incident Data Loading\n\nLoad service incident location and details data from the compressed parquet file. This data represents service requests that need to be assigned to technicians."
},
{
"cell_type": "code",
"id": "f0a374d5-58db-4f8a-bb02-3dad0f9cc376",
"metadata": {
"language": "sql",
"name": "load_data_incident",
"codeCollapsed": false
},
"outputs": [],
"source": "CREATE OR REPLACE TRANSIENT TABLE field_incidents (\n INCIDENT_ID VARCHAR(50),\n LONGITUDE FLOAT,\n LATITUDE FLOAT,\n INCIDENT_ADDRESS VARCHAR(200),\n INCIDENT_DESCRIPTION VARCHAR(500)\n);\n\nINSERT INTO field_incidents VALUES\n('b414', -97.6881817, 30.3768186, '10804 Motheral Austin TX 78753', 'UPS for network rack indicating battery fault.'),\n('9b32', -97.87617, 30.23581, '7010 Highway 71 Austin TX 78735', 'Backbone link utilization at 95% causing latency spikes.'),\n('b34e', -97.760755, 30.239522, '2529 1st Austin TX 78704', 'Customer reports slow website loading times but speed test is fine.'),\n('8dcc', -97.8647087, 30.1999503, '5011 Slaughter Austin TX 78749', 'Buildings internal network switches losing configuration.'),\n('9283', -97.7593325, 30.337858, '5608 Parkcrest Austin TX 78731', 'Buildings internal network switches losing configuration.'),\n('a650', -97.748183, 30.269608, '520 6th Austin TX 78701', 'Residential voice service has one-way audio.'),\n('8b53', -97.688077, 30.320647, '6825 Highway 290 Austin TX 78723', 'Multiple ONT devices offline in a specific subdivision.'),\n('87dd', -97.7776059, 30.4444113, '13170 Pond Springs Austin TX 78729', 'Mobile hotspot device overheating and shutting down.'),\n('b7a2', -97.6922448, 30.3136545, '1701 Briarcliff Austin TX 78723', 'DSLAM port showing down status for multiple subscribers.'),\n('9fce', -97.7270903, 30.3764049, '9233 Waterford Centre Austin TX 78758', 'Business customer reports degraded voice quality on SIP trunk.')\n ",
"execution_count": null
},
{
"cell_type": "markdown",
"id": "a5a9bc8a-a37e-4c90-a881-95f02ae898fe",
"metadata": {
"name": "cell10",
"collapsed": false
},
"source": "## 5. ๐Ÿ”ง Core Implementation: Spatial Analysis Stored Procedure\n\nThis section contains the complete implementation of the closest facility analysis using ArcGIS Network Analysis within a Snowflake stored procedure.\n\n### 5.1 ๐Ÿ“‹ Implementation Overview\n\nThe stored procedure implements a comprehensive field service optimization workflow:\n\n1. **Authentication & Connection**: Secure connection to ArcGIS services using stored credentials\n2. **Data Loading**: Retrieval of technician and incident data from Snowflake tables\n3. **Spatial Analysis**: ArcGIS Network Analysis for optimal technician-incident assignments\n4. **Route Calculation**: Detailed routing with travel time and distance metrics\n5. **Result Storage**: Processed results stored back in Snowflake tables\n6. **Web Publishing**: Automatic creation of interactive maps and feature services\n\n### 5.2 ๐Ÿ—๏ธ Architecture Components\n\n#### ArcGISProcessorBase Class\n- **Purpose**: Base class providing common ArcGIS connectivity functionality\n- **Features**: Credential management, secure authentication, connection handling\n- **Security**: Leverages Snowflake's internal secret management APIs\n\n#### ClosestFacilityProcessor Class\n- **Purpose**: Specialized processor for closest facility network analysis\n- **Capabilities**: \n - Spatial dataframe creation and manipulation\n - Network analysis execution with ArcGIS services\n - Result processing and optimization\n - Automated web map publishing\n\n### 5.3 ๐Ÿ” Key Technical Features\n\n#### Network Analysis Implementation\n- **Algorithm**: ArcGIS `find_closest_facilities()` network analysis\n- **Travel Mode**: Driving time optimization with traffic consideration\n- **Constraints**: Distance/time cutoffs for realistic assignments\n- **Scalability**: Handles multiple incidents and technicians simultaneously\n\n#### Data Processing Pipeline\n- **Input**: Snowflake tables with technician and incident locations\n- **Processing**: Spatial 
dataframe conversion and geometric operations\n- **Output**: Optimized assignments with detailed routing information\n- **Storage**: Results persisted in structured Snowflake tables\n\n#### Web Publishing Automation\n- **Feature Layers**: Automatic creation of interactive map layers\n- **Web Maps**: Comprehensive dashboard with all analysis results\n- **Symbology**: Professional cartographic styling for business users\n- **Sharing**: Configurable access permissions and organizational sharing"
},
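{
"cell_type": "markdown",
"id": "c1a2b3d4-0005-4e5f-8a9b-0a1b2c3d4e5f",
"metadata": {
"collapsed": false,
"name": "cell_secret_alias_note"
},
"source": "### 5.4 Note: Secret Alias in the Stored Procedure\n\nThe processing code below reads credentials with `_snowflake.get_username_password('esri_api_key')`. For that call to work, the stored procedure must be registered with the `eai_esri_api` integration and expose the `arcgis_api_key` secret under the alias `esri_api_key`. The registration itself is not shown in this notebook chunk; a typical definition (procedure name, handler, and package list are illustrative only) looks like:\n\n```sql\nCREATE OR REPLACE PROCEDURE find_closest_technicians()\n  RETURNS VARCHAR\n  LANGUAGE PYTHON\n  RUNTIME_VERSION = '3.10'\n  PACKAGES = ('snowflake-snowpark-python', 'pandas')\n  HANDLER = 'main'\n  EXTERNAL_ACCESS_INTEGRATIONS = (eai_esri_api)\n  SECRETS = ('esri_api_key' = arcgis_api_key)\nAS\n$$\ndef main(session):\n    # handler body: the processing classes defined in the next cell\n    return 'OK'\n$$;\n```"
},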
{
"cell_type": "code",
"execution_count": null,
"id": "371544d4-aa31-42c6-b481-2ab7f5dc0f58",
"metadata": {
"codeCollapsed": false,
"collapsed": false,
"language": "python",
"name": "define_spatial_processing_functionality"
},
"outputs": [],
"source": "\"\"\"\nField Service Optimization: Closest Facility Analysis Demo\n\nThis script demonstrates the implementation of ArcGIS Network Analysis \"Find Closest Facilities\"\nfunctionality for field service optimization scenarios. It finds the optimal assignment of \nfield technicians to service incidents based on travel time and distance.\n\nOverview:\n---------\nThe script implements a complete workflow for field service dispatch optimization:\n1. Loads field technician locations and service incident data from Snowflake\n2. Uses ArcGIS Network Analysis to find the closest available technician for each incident\n3. Calculates optimal routes with travel time and distance metrics\n4. Publishes results as interactive web maps and feature layers in ArcGIS Online/Portal\n\nKey Features:\n-------------\n- **Closest Facility Analysis**: Leverages ArcGIS Network Analyst to solve routing problems\n- **Multi-modal Integration**: Combines Snowflake data storage with ArcGIS spatial analysis\n- **Web Publishing**: Automatically creates interactive maps and feature services\n- **Comprehensive Logging**: Detailed logging for monitoring and debugging\n\nUse Cases:\n----------\n- Emergency response dispatch (police, fire, medical)\n- Field service technician assignment\n- Delivery route optimization\n- Facility location analysis\n- Resource allocation planning\n\nTechnical Implementation:\n------------------------\nBased on ArcGIS API for Python Network Analysis capabilities:\n- Uses `arcgis.network.analysis.find_closest_facilities()` for core routing logic\n- Implements spatial dataframes for geographic data handling\n- Integrates with Snowflake for enterprise data management\n- Publishes results to ArcGIS Online/Portal for visualization\n\nData Flow:\n----------\n1. Snowflake โ†’ Load technician and incident locations\n2. ArcGIS โ†’ Perform closest facility network analysis\n3. Processing โ†’ Calculate routes, distances, and assignments\n4. 
Snowflake โ†’ Store results in assignment tables\n5. ArcGIS Online โ†’ Publish interactive maps and feature layers\n\nRequirements:\n-------------\n- ArcGIS API for Python (arcgis package)\n- Snowflake Snowpark for Python\n- Valid ArcGIS Online/Portal credentials\n- Network Analysis service access\n- Pandas, GeoPandas for data manipulation\n\nReference Documentation:\n------------------------\nArcGIS Find Closest Facilities Guide:\nhttps://developers.arcgis.com/python/latest/guide/part4-find-closest-facilities/\n\"\"\"\n\nimport sys\nfrom datetime import datetime\nimport pandas as pd\nimport pandas as pd\nimport json\nimport datetime as dt\n\nclass ArcGISProcessorBase:\n \"\"\"\n A base class, with common functionality, this will be used by all the classes that \n will be used to demonstrate the capabilities of interfacing with feature layer of ArcGIS Online.\n \"\"\"\n \n def __init__(self ,p_session: Session ,p_arcgis_portal: str):\n # Initialize instance variables\n #TODO: >>>>>>>>>>>>>> This url should not be hardcoded <<<<<<<<<<<<\n self.portal_url = p_arcgis_portal\n self.session = p_session\n self.credentials = {}\n self.gis = None\n \n def get_arcgis_credentials(self):\n \"\"\"Get ArcGIS credentials from stored secrets\"\"\"\n print('Getting ArcGIS credentials from Snowflake secrets...')\n\n import _snowflake\n \n try:\n username_password_object = _snowflake.get_username_password('esri_api_key');\n self.credentials[\"Username\"] = username_password_object.username\n self.credentials[\"Password\"] = username_password_object.password\n print(\"Successfully retrieved credentials from secrets\")\n except Exception as e:\n print(f\"Failed to get credentials from secrets: {e}\")\n # raise the error\n raise Exception(f\"Failed to get credentials from secrets: {e}\")\n \n return self.credentials\n \n def connect_to_arcgis(self):\n from arcgis.gis import GIS\n \n \"\"\"Connect to ArcGIS Online/Portal\"\"\"\n user_name = self.credentials['Username']\n user_pass = 
self.credentials['Password']\n print(f\"Connecting to ArcGIS at {self.portal_url}...\")\n print(f\"Using username: {user_name}\")\n \n try:\n self.gis = GIS(url=self.portal_url, username=user_name, password=user_pass)\n print(f\"Successfully connected to {self.portal_url}\")\n print(f\"Logged in to ArcGIS as {user_name}\")\n return self.gis\n except Exception as e:\n print(f\"ArcGIS login failed: {e}\")\n print(\"Please check your credentials and URL\")\n self.gis = None\n return self.gis\n \nclass ClosestFacilityProcessor(ArcGISProcessorBase):\n \"\"\"\n A class to handle closest facility processing with DEBUG logging specific to this class.\n \"\"\"\n # from arcgis.features import GeoAccessor, GeoSeriesAccessor\n # from arcgis.gis import GIS\n # from arcgis.features import FeatureLayer, Feature, FeatureSet, use_proximity, FeatureCollection\n # from arcgis.network import analysis\n \n def __init__(self, p_session: Session, p_arcgis_portal: str):\n super().__init__(p_session, p_arcgis_portal)\n print(\"ClosestFacilityProcessor initialized\")\n \n def load_technician_data(self):\n \"\"\"Load technician data from Snowflake\"\"\"\n from arcgis.features import GeoAccessor, GeoSeriesAccessor\n \n print(\"Starting to load technician data from Snowflake...\")\n \n try:\n df = self.session.table('field_technician').to_pandas()\n \n df['ID'] = df['FIELD_TECHNICIAN_ID']\n df['NAME'] = df['FIELD_TECHNICIAN_ID']\n \n # Create spatial dataframe\n field_technician_sdf = pd.DataFrame.spatial.from_xy(\n df, \n x_column='longitude'.upper(), \n y_column='latitude'.upper()\n )\n \n print(f\"Created spatial dataframe with {len(field_technician_sdf)} technicians\")\n \n return field_technician_sdf\n \n except Exception as e:\n print(f\"Error loading technician data: {e}\")\n raise\n \n def load_incidents_data(self, limit=10):\n \"\"\"Load incidents data from Snowflake\"\"\"\n from arcgis.features import GeoAccessor, GeoSeriesAccessor\n print(f\"Starting to 
load incidents data (limit: {limit})...\")\n \n try:\n df = self.session.table('field_incidents').limit(limit).to_pandas()\n \n df['ID'] = df['INCIDENT_ID']\n df['NAME'] = df['INCIDENT_ID']\n \n # Create spatial dataframe\n field_incidents_sdf = pd.DataFrame.spatial.from_xy(\n df, \n x_column='longitude'.upper(), \n y_column='latitude'.upper()\n )\n \n print(f\"Created spatial dataframe with {len(field_incidents_sdf)} incidents\")\n \n return field_incidents_sdf\n \n except Exception as e:\n print(f\"Error loading incidents data: {e}\")\n raise\n \n def find_closest_technician_to_incident(self, field_technician_sdf, field_incidents_sdf):\n \"\"\"Find closest technician to incident\"\"\"\n from arcgis.features import GeoAccessor, GeoSeriesAccessor\n from arcgis.network import analysis\n \n print(\"Finding closest technician to incident ...\")\n field_technician_fs = field_technician_sdf.spatial.to_featureset()\n incidents_fs = field_incidents_sdf.spatial.to_featureset()\n\n print(f\"Number of facilities (technicians): {len(field_technician_fs)}\")\n print(f\"Number of incidents: {len(incidents_fs)}\")\n\n closest_facility_result = None\n # current_time = dt.datetime.now()\n try:\n # Perform closest facility analysis\n closest_facility_result = analysis.find_closest_facilities(\n incidents=incidents_fs, # Origins (incidents)\n facilities=field_technician_fs, # Destinations (technicians) \n number_of_facilities_to_find=1, # Find 1 closest technician per incident\n cutoff=50, # Maximum cutoff in travel-mode units (minutes for Driving Time)\n # time_of_day=current_time, # Current time for traffic consideration\n travel_mode=\"Driving Time\", # Travel mode\n # time_zone_for_time_of_day=\"America/Chicago\", # Austin timezone\n save_output_network_analysis_layer=False, # Don't save as service\n # return_facilities=True, # Return facility info\n # return_incidents=True, # Return incident info \n # return_routes=True, # Return 
route geometry\n gis = self.gis # GIS connection\n )\n \n print(\"✅ Closest facility analysis completed successfully!\")\n print(f\"Solve succeeded: {closest_facility_result.solve_succeeded}\")\n except Exception as e:\n print(f\"❌ Error in closest facility analysis: {e}\")\n return closest_facility_result\n\n def process_closest_facility_result(self, closest_facility_result):\n \"\"\"Process closest facility result\"\"\"\n print(\"Processing closest facility result ...\")\n routes_sdf = closest_facility_result.output_routes.sdf\n facilities_sdf = closest_facility_result.output_closest_facilities.sdf \n incidents_sdf = closest_facility_result.output_incidents.sdf\n\n print(\"========== Results Summary: ==========\")\n print(f\"Routes found: {len(routes_sdf)}\")\n print(f\"Facilities assigned: {len(facilities_sdf)}\")\n print(f\"Incidents processed: {len(incidents_sdf)}\")\n\n return routes_sdf, facilities_sdf, incidents_sdf\n\n def calculate_technician_to_incident_assignment_route(self, routes_sdf):\n \"\"\"Calculate technician to incident assignment route\"\"\"\n print(\"Calculating technician to incident assignment route ...\")\n routes_enhanced_df = routes_sdf.copy()\n routes_enhanced_df['Route_Name'] = routes_enhanced_df.apply(\n lambda row: f\"Tech-{row['FacilityID'][:8]} → Incident-{row['IncidentID'][:8]}\", axis=1\n )\n routes_enhanced_df['Distance_KM'] = routes_enhanced_df['Total_Kilometers'].round(2)\n routes_enhanced_df['Travel_Time_Min'] = routes_enhanced_df['Total_Minutes'].round(1)\n routes_enhanced_df['SHAPE_WKT'] = routes_enhanced_df['SHAPE'].apply(lambda geom: geom.WKT if geom else None)\n\n try:\n print(\"Writing to table: technician_incidents_assignment_routes ...\")\n self.session.write_pandas(routes_enhanced_df, 'technician_incidents_assignment_routes' \n ,quote_identifiers=False ,auto_create_table=True \n ,overwrite=True ,table_type='transient'\n ,use_logical_type = True\n )\n print(\"Successfully written 
to table: technician_incidents_assignment_routes\")\n except Exception as e:\n print(f\"Failed writing to table: {e}\")\n\n return routes_enhanced_df\n\n def create_assignment_summary(self, routes_sdf):\n \"\"\"Create assignment summary\"\"\"\n print(\"Creating assignment summary ...\")\n assignment_df = None\n assignments = []\n for _, route in routes_sdf.iterrows():\n assignments.append({\n 'Incident_ID': route['IncidentID'],\n 'Assigned_Technician': route['FacilityID'],\n 'Distance_KM': round(route['Total_Kilometers'], 2),\n 'Travel_Time_Min': round(route['Total_Minutes'], 1)\n })\n\n assignment_df = pd.DataFrame(assignments)\n\n try:\n print(\"Writing to table: incident_to_technician_assignment ...\")\n self.session.write_pandas(assignment_df, 'incident_to_technician_assignment' \n ,quote_identifiers=False ,auto_create_table=True \n ,overwrite=True ,table_type='transient')\n print(\"Successfully written to table: incident_to_technician_assignment\")\n except Exception as e:\n print(f\"Failed writing to table: {e}\")\n\n return assignment_df\n\n def publish_feature_layer(self, sdf, layer_name, title_suffix=\"\"):\n \"\"\"\n Publish a spatial dataframe as a feature layer to ArcGIS Online\n \"\"\"\n from arcgis.features import GeoAccessor, GeoSeriesAccessor\n try:\n # Convert to feature layer\n feature_layer = sdf.spatial.to_featurelayer(\n gis=self.gis,\n title=f\"{layer_name}{title_suffix}\",\n tags=[\"field service\", \"incidents\", \"technicians\", \"routing\"],\n description=f\"Auto-generated {layer_name} layer for field service optimization\",\n sanitize_columns=True # For handling scenarios where column name size > 10\n )\n \n print(f\"✅ Successfully published {layer_name} as feature layer\")\n print(f\" Layer ID: {feature_layer.id}\")\n print(f\" URL: {feature_layer.url}\")\n return feature_layer\n \n except Exception as e:\n print(f\"❌ Failed to publish {layer_name}: {e}\")\n return None\n\n def publish_incidents_as_feature_layer(self, 
field_incidents_sdf):\n \"\"\"Publish incidents layer\"\"\"\n print(\"Publishing incidents as featurelayer ...\")\n\n # Need to do this, as the column name cannot be more than 10 characters\n field_incidents_sdf.drop(['INCIDENT_ID'] , axis=1,inplace=True)\n column_mapping = {\n 'INCIDENT_ADDRESS': 'ADDRESS', \n 'INCIDENT_DESCRIPTION': 'INCIDENT'\n }\n field_incidents_sdf.rename(columns=column_mapping, inplace=True)\n\n # convert to spatial dataframe\n # field_incidents_sdf = pd.DataFrame.spatial.from_xy(df, x_column='longitude'.upper(), y_column='latitude'.upper())\n\n # 1. Publish incidents layer\n feature_layer_name = f\"Field_Service_Incidents_{datetime.now().strftime('%Y%m%d_%H%M%S')}\"\n incidents_feature_layer = self.publish_feature_layer(\n field_incidents_sdf, \n feature_layer_name,\n title_suffix=\" - Austin TX\"\n )\n\n # ----- Optional: Set the extent of the feature layer -----\n # set the center of the feature layer to the first point\n # Define a new, small extent around the first point\n # Adjust the buffer (0.001) as needed for your data's scale\n first_point_geom = field_incidents_sdf.iloc[0].SHAPE\n x, y = first_point_geom.x, first_point_geom.y\n extent = f'[{x - 0.001}, {y - 0.001}, {x + 0.001}, {y + 0.001}]'\n # Update the published item's extent\n incidents_feature_layer.update({'extent': extent})\n\n return incidents_feature_layer\n\n def publish_technicians_as_feature_layer(self, field_technician_sdf):\n \"\"\"Publish technicians layer\"\"\"\n from arcgis.features import GeoAccessor, GeoSeriesAccessor\n print(\"Publishing technicians as featurelayer ...\")\n \n field_technician_sdf.drop(['FIELD_TECHNICIAN_ID'] , axis=1,inplace=True)\n column_mapping = {\n 'FIELD_TECHNICIAN_ADDRESS': 'ADDRESS'\n }\n field_technician_sdf.rename(columns=column_mapping, inplace=True)\n\n # convert to spatial dataframe\n field_technician_sdf = pd.DataFrame.spatial.from_xy(\n field_technician_sdf, x_column='longitude'.upper(), y_column='latitude'.upper())\n\n # 2. 
Publish technicians layer\n feature_layer_name = f\"Field_Technicians_{datetime.now().strftime('%Y%m%d_%H%M%S')}\"\n technicians_layer = self.publish_feature_layer(\n field_technician_sdf, \n feature_layer_name,\n title_suffix=\" - Austin TX\"\n )\n\n # ----- Optional: Set the extent of the feature layer -----\n # set the center of the feature layer to the first point\n # Define a new, small extent around the first point\n # Adjust the buffer (0.001) as needed for your data's scale\n first_point_geom = field_technician_sdf.iloc[0].SHAPE\n x, y = first_point_geom.x, first_point_geom.y\n extent = f'[{x - 0.001}, {y - 0.001}, {x + 0.001}, {y + 0.001}]'\n # Update the published item's extent\n technicians_layer.update({'extent': extent})\n\n return technicians_layer\n\n def publish_routes_as_feature_layer(self, routes_sdf):\n \"\"\"Publish routes layer\"\"\"\n print(\"Publishing routes as featurelayer ...\")\n \n # The routes are stored in Snowflake as WKT string, hence there is a special handling that needs to\n # be followed to convert to a spatial dataframe\n\n from shapely import wkb, wkt\n from geopandas import GeoDataFrame\n from arcgis.features import GeoAccessor, GeoSeriesAccessor\n\n # Load dataframe from table\n route_df = (self.session.table('technician_incidents_assignment_routes')\n .select('OBJECTID','NAME','FACILITYID','INCIDENTID','ROUTE_NAME','SHAPE_WKT')\n .to_pandas())\n\n # Convert dataframe into a spatial dataframe.\n # Convert WKT strings to geometry objects\n route_df['SHAPE'] = route_df['SHAPE_WKT'].apply(wkt.loads)\n\n # dropping this column as i dont need it anymore\n route_df.drop(['SHAPE_WKT'] , axis=1,inplace=True, errors='ignore')\n\n gdf = GeoDataFrame(route_df, crs=\"EPSG:3435\", geometry=route_df['SHAPE'])\n routes_sdf = GeoAccessor.from_geodataframe(gdf)\n routes_sdf.head()\n\n # 3. 
Publish routes layer (from closest facility analysis)\n feature_layer_name = f\"Optimized_Routes_{datetime.now().strftime('%Y%m%d_%H%M%S')}\"\n routes_feature_layer = self.publish_feature_layer(\n routes_sdf, \n feature_layer_name, \n title_suffix=\" - Austin TX\"\n )\n return routes_feature_layer\n\n def publish_webmap_with_layers(self, incidents_layer, technicians_layer, routes_layer):\n \"\"\"\n Create a webmap with incidents, technicians, and routes layers\n \"\"\"\n try:\n from arcgis.mapping import WebMap\n from arcgis.widgets import MapView\n \n # Create a new webmap\n webmap = WebMap()\n \n # Add basemap\n webmap.basemap = \"streets-vector\" # Modern vector basemap\n \n # Add layers with proper symbology\n if incidents_layer:\n webmap.add_layer(incidents_layer, options={\n \"title\": \"Service Incidents\",\n \"opacity\": 0.8\n })\n \n if technicians_layer:\n webmap.add_layer(technicians_layer, options={\n \"title\": \"Field Technicians\", \n \"opacity\": 0.8\n })\n \n if routes_layer:\n webmap.add_layer(routes_layer, options={\n \"title\": \"Optimized Routes\",\n \"opacity\": 0.7\n })\n \n # Set map extent to Austin area\n webmap.definition.operationalLayers\n \n # Save the webmap\n map_title = \"Austin Field Service Operations Dashboard\"\n webmap_properties = {\n \"title\": map_title,\n \"snippet\": \"Interactive map showing field service incidents, technician locations, and optimized routes\",\n \"description\": \"This map displays real-time field service operations including incident locations, available technicians, and optimal routing solutions.\",\n \"tags\": [\"field service\", \"optimization\", \"routing\", \"incidents\", \"technicians\"],\n \"type\": \"Web Map\"\n }\n \n saved_webmap = webmap.save(item_properties=webmap_properties)\n \n print(f\"โœ… Successfully created webmap: {map_title}\")\n print(f\" Map ID: {saved_webmap.id}\")\n print(f\" Map URL: {self.portal_url}home/webmap/viewer.html?webmap={saved_webmap.id}\")\n \n return saved_webmap\n 
\n except Exception as e:\n print(f\"โŒ Failed to create webmap: {e}\")\n return None\n\ndef calculate_closest_facility(p_session: Session ,p_arcgis_portal: str):\n # Create a general logger for the main execution\n print(\"Starting Closest Facility Processing Application\")\n ret = {}\n # Create and use the ClosestFacilityProcessor class\n processor = ClosestFacilityProcessor(p_session,p_arcgis_portal)\n \n try:\n # Get credentials and connect to ArcGIS\n processor.get_arcgis_credentials()\n processor.connect_to_arcgis()\n ret['connected_to_arcgis'] = True\n \n # Load data\n field_technician_sdf = processor.load_technician_data()\n field_incidents_sdf = processor.load_incidents_data()\n ret['loaded_data'] = True\n \n closest_facility_result = processor.find_closest_technician_to_incident(field_technician_sdf, field_incidents_sdf)\n ret['calculated_closest_facility'] = True\n\n if closest_facility_result is None:\n ret['was_able_find_closest_facility'] = False\n print(\"Unable to find closest technician to incident\")\n raise Exception(\"Unable to find closest technician to incident\")\n elif (closest_facility_result and closest_facility_result.solve_succeeded) == False:\n ret['was_able_find_closest_facility'] = False\n print(\"โŒ No routes available for publishing\")\n print(\"๐ŸŒ Your field service optimization map is not available online, as no routes were found!\")\n raise Exception(\"No routes available for publishing\")\n\n ret['was_able_find_closest_facility'] = True\n routes_sdf, facilities_sdf, incidents_sdf = processor.process_closest_facility_result(closest_facility_result)\n routes_enhanced_df = processor.calculate_technician_to_incident_assignment_route(routes_sdf)\n assignment_df = processor.create_assignment_summary(routes_sdf)\n ret['processed_and_stored_in_tables'] = True\n\n # ---- Publish to feature layer ----\n incidents_feature_layer = processor.publish_incidents_as_feature_layer(field_incidents_sdf)\n technicians_feature_layer = 
processor.publish_technicians_as_feature_layer(field_technician_sdf)\n routes_feature_layer = processor.publish_routes_as_feature_layer(routes_sdf)\n webmap = processor.publish_webmap_with_layers(incidents_feature_layer, technicians_feature_layer, routes_feature_layer)\n ret['Uploaded_to_arcgis_portal'] = True\n\n print(\"Application completed successfully!\")\n ret['success'] = True\n \n except Exception as e:\n ret['success'] = False\n print(f\"Application failed: {e}\")\n raise\n\n print(\"Finished!!!\")\n return ret\n\n# -----------------------------\n# Register the stored procedure\n# Doc: https://docs.snowflake.com/en/developer-guide/snowpark/reference/python/1.29.0/snowpark/api/snowflake.snowpark.functions.sproc\n#\nsproc_calculate_closest_facility = sp_session.sproc.register(\n func = calculate_closest_facility\n ,return_type = T.VariantType()\n ,name = 'calculate_closest_facility_sproc' \n ,replace = True ,is_permanent=True,stage_location='@arcgis_lib_stg'\n ,packages=[\"arcgis\", \"pyshp\", \"shapely\", \"geopandas\",\"streamlit\"] #, \"pyshp\", \"shapely\", \"geopandas\"\n ,external_access_integrations=[ \"eai_esri_api\"]\n ,artifact_repository = 'snowflake.snowpark.pypi_shared_repository'\n ,secrets = {\n 'esri_api_key' : 'arcgis_api_key'\n }\n)\n"
},
{
"cell_type": "markdown",
"id": "6acb2ede-dc5b-4db1-b592-278c46ee0319",
"metadata": {
"name": "cell11",
"collapsed": false
},
"source": [
"## 6. ๐Ÿš€ Execution: Running the Spatial Analysis\n",
"\n",
"This section demonstrates how to invoke the stored procedure to perform the closest facility analysis. The execution includes data preparation, analysis execution, and result verification.\n",
"\n",
"### 6.1 ๐Ÿงน Data Preparation\n",
"\n",
"Before running the analysis, existing result tables are cleaned up to ensure fresh results and prevent data conflicts.\n",
"\n",
"### 6.2 ๐Ÿ“ค Stored Procedure Invocation\n",
"\n",
"The stored procedure call executes the complete workflow:\n",
"- Loads technician and incident data from Snowflake tables\n",
"- Performs network analysis using ArcGIS services\n",
"- Calculates optimal assignments and routes\n",
"- Stores results in output tables\n",
"- Publishes interactive web maps to ArcGIS Online\n",
"\n",
"### 6.3 โšก Expected Processing Flow\n",
"\n",
"1. **Authentication**: Secure connection to ArcGIS services\n",
"2. **Data Loading**: Retrieval of spatial data from Snowflake\n",
"3. **Network Analysis**: Closest facility calculations\n",
"4. **Route Optimization**: Detailed routing with travel metrics\n",
"5. **Result Storage**: Output tables creation in Snowflake\n",
"6. **Web Publishing**: Interactive map and feature service creation\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "21886ac6-7723-4707-8b43-f552dcbb1176",
"metadata": {
"language": "sql",
"name": "target_table_cleanup"
},
"outputs": [],
"source": [
"drop table if exists technician_incidents_assignment_routes;\n",
"drop table if exists incident_to_technician_assignment;"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "756ee612-9510-4f47-afa2-da0249c80fdc",
"metadata": {
"language": "sql",
"name": "invoke_spatial_processing_job"
},
"outputs": [],
"source": [
"call calculate_closest_facility_sproc('https://{{arcgis_portal_url}}/')"
]
},
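{
"cell_type": "code",
"execution_count": null,
"id": "ce110000-1111-2222-3333-ffffff000002",
"metadata": {
"language": "sql",
"name": "verify_output_tables"
},
"outputs": [],
"source": [
"-- Optional sanity check (illustrative): confirm the stored procedure created its\n",
"-- output tables before running the analysis queries in the next section.\n",
"-- Table names match the write_pandas calls inside the stored procedure.\n",
"show tables like 'technician_incidents_assignment_routes';\n",
"show tables like 'incident_to_technician_assignment';"
]
},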
{
"cell_type": "markdown",
"id": "3d0a8b16-d03f-42b4-831f-3d6a8afc27d1",
"metadata": {
"name": "cell19"
},
"source": [
"## 6. ๐Ÿ“Š Results Analysis and Data Exploration\n",
"\n",
"Explore the results generated by the spatial analysis job. The job creates several output tables in Snowflake containing detailed assignment information, route data, and performance metrics.\n",
"\n",
"### Output Tables Created\n",
"\n",
"1. **`technician_incidents_assignment_routes`**: \n",
" - Complete route geometries and detailed routing information\n",
" - Travel distances, times, and turn-by-turn directions\n",
" - Route shapes stored in both binary (SHAPE) and text (SHAPE_WKT) formats\n",
"\n",
"2. **`incident_to_technician_assignment`**: \n",
" - Summary of technician assignments to incidents\n",
" - Key performance metrics: distance, travel time, assignment efficiency\n",
" - Optimized for reporting and dashboard creation\n",
"\n",
"### Key Performance Indicators\n",
"\n",
"- **Response Time Optimization**: Average travel time to incidents\n",
"- **Resource Utilization**: Workload distribution across technicians\n",
"- **Geographic Coverage**: Spatial distribution of assignments\n",
"- **Efficiency Metrics**: Distance vs. time trade-offs in assignments\n",
"\n",
"### Sample Queries\n",
"\n",
"Use the following SQL queries to explore your results:\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "ce110000-1111-2222-3333-ffffff000000",
"metadata": {
"language": "sql",
"name": "sample_output_1",
"vscode": {
"languageId": "sql"
}
},
"outputs": [],
"source": [
"\n",
"-- NOTE: Please switch this to SQL cell.\n",
"\n",
"-- ==========================================\n",
"-- DETAILED ROUTE ANALYSIS\n",
"-- ==========================================\n",
"\n",
"-- Explore the detailed route information table\n",
"-- This table contains the complete routing solution with geometry data\n",
"SELECT \n",
" OBJECTID, -- Unique route identifier\n",
" NAME as ROUTE_NAME, -- Human-readable route description\n",
" FACILITYID as TECHNICIAN_ID, -- Assigned technician identifier\n",
" INCIDENTID as INCIDENT_ID, -- Target incident identifier\n",
" ROUTE_NAME as ASSIGNMENT, -- Formatted assignment description\n",
" DISTANCE_KM, -- Total route distance in kilometers\n",
" TRAVEL_TIME_MIN, -- Estimated travel time in minutes\n",
" -- SHAPE, -- Binary geometry (uncomment to view)\n",
" LEFT(SHAPE_WKT, 100) as SHAPE_PREVIEW -- Preview of route geometry in WKT format\n",
"FROM technician_incidents_assignment_routes\n",
"ORDER BY DISTANCE_KM ASC -- Show shortest routes first\n",
"LIMIT 5;\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "ce110000-1111-2222-3333-ffffff000001",
"metadata": {
"language": "sql",
"name": "sample_output_2",
"vscode": {
"languageId": "sql"
}
},
"outputs": [],
"source": [
"\n",
"-- NOTE: Please switch this to SQL cell.\n",
" \n",
"-- ==========================================\n",
"-- ASSIGNMENT SUMMARY AND PERFORMANCE METRICS\n",
"-- ==========================================\n",
"\n",
"-- NOTE: Please switch this to SQL cell.\n",
"\n",
"-- Analyze workload distribution and performance metrics by technician\n",
"-- This query provides insights into resource utilization and optimization efficiency\n",
"SELECT\n",
" ASSIGNED_TECHNICIAN, -- Technician identifier\n",
" COUNT(INCIDENT_ID) as INCIDENTS_ASSIGNED, -- Number of incidents assigned\n",
" ROUND(SUM(DISTANCE_KM), 2) as TOTAL_DISTANCE_KM, -- Total travel distance\n",
" ROUND(AVG(DISTANCE_KM), 2) as AVG_DISTANCE_KM, -- Average distance per incident\n",
" ROUND(SUM(TRAVEL_TIME_MIN), 1) as TOTAL_TRAVEL_TIME_MIN, -- Total travel time\n",
" ROUND(AVG(TRAVEL_TIME_MIN), 1) as AVG_TRAVEL_TIME_MIN, -- Average response time\n",
" ROUND(SUM(DISTANCE_KM) / SUM(TRAVEL_TIME_MIN) * 60, 1) as AVG_SPEED_KMH -- Overall average speed\n",
"FROM incident_to_technician_assignment\n",
"GROUP BY ASSIGNED_TECHNICIAN\n",
"ORDER BY INCIDENTS_ASSIGNED DESC, AVG_TRAVEL_TIME_MIN ASC; -- Show busiest technicians with best response times first\n"
]
},
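{
"cell_type": "code",
"execution_count": null,
"id": "ce110000-1111-2222-3333-ffffff000003",
"metadata": {
"language": "sql",
"name": "sample_output_3"
},
"outputs": [],
"source": [
"-- ==========================================\n",
"-- OVERALL OPTIMIZATION KPIs (illustrative)\n",
"-- ==========================================\n",
"\n",
"-- Roll the per-technician metrics up into a single fleet-wide summary row.\n",
"-- Column names follow the assignment table created by the stored procedure.\n",
"SELECT\n",
"    COUNT(*) as TOTAL_ASSIGNMENTS,                           -- Incidents with an assigned technician\n",
"    COUNT(DISTINCT ASSIGNED_TECHNICIAN) as TECHNICIANS_USED, -- Distinct technicians dispatched\n",
"    ROUND(SUM(DISTANCE_KM), 2) as TOTAL_DISTANCE_KM,         -- Fleet-wide travel distance\n",
"    ROUND(AVG(TRAVEL_TIME_MIN), 1) as AVG_RESPONSE_TIME_MIN, -- Average travel time per incident\n",
"    ROUND(MAX(TRAVEL_TIME_MIN), 1) as WORST_RESPONSE_TIME_MIN -- Longest single travel time\n",
"FROM incident_to_technician_assignment;"
]
},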
{
"cell_type": "markdown",
"id": "650a35bd-bd41-4e6b-906a-f8c05280b670",
"metadata": {
"name": "cell12",
"collapsed": false
},
"source": "---\n\n# ๐ŸŽ‰ Conclusion\n\nThis demonstration showcases the transformative potential of integrating ArcGIS spatial analysis capabilities directly within Snowflake using stored procedures. The solution eliminates traditional architectural constraints while delivering enterprise-grade security, performance, and scalability.\n\n## Key Takeaways\n\n1. **No intermediary external data movement**: Complete spatial analysis originating from Snowflake to ArcGIS\n2. **Serverless Architecture**: No infrastructure overhead or maintenance complexity\n3. **Automated Publishing**: Direct creation of interactive web maps and services"
}
]
}