Enterprise Data Architect

Apex Informatics
Location: Tallahassee, FL, USA
Published: 5/3/2026
Full time
Title: Enterprise Data Architect

Job Description:

Key Responsibilities
  • Design and implement a data integration framework to support enterprise data initiatives.
  • Guide the transition and integration of enterprise data to unify various asset domains.
  • Join location data with engineering datasets.
  • Orchestrate extract, transform, and load (ETL) pipelines, application programming interfaces (APIs), and service endpoints.
  • Enable interoperable data flows across statewide enterprise systems and enforce lineage, stewardship, and "collect once, use many" principles.
  • Blueprint and standardize data schemas.
  • Review as-is business processes, remediation strategies, reengineering, design, and integration.
  • Ensure compatibility with cloud architecture and align with the state's cloud-first policy.
  • Analyze and remediate data to ensure it is secure and scalable, using platforms such as Azure, Snowflake, Informatica, and PostgreSQL.
  • Verify data quality once data is integrated and provisioned.

Current State Assessment
  • Review existing systems to identify data silos, schema inconsistencies, and governance gaps.
  • Engage stakeholders to gather and document integration needs.
  • Analyze and verify existing data models to assess the impact of moving to a modern data model.
  • Evaluate dependencies and scope of necessary changes.
  • Conduct impact analysis on reporting and application usage.

Framework Design and Architecture
  • Define enterprise data architecture strategy including canonical models, APIs, and ontologies.
  • Map integration points across Azure, Snowflake, Informatica, and PostgreSQL.
  • Develop conceptual, logical, and physical data models.
  • Redesign data models for enterprise domains.
  • Ensure spatial data integration with GIS systems.

Data Governance and Quality
  • Implement governance structures, metadata standards, and stewardship roles.
  • Design and implement data quality measures.
  • Automate and streamline ETL processes.
  • Create metadata standards, lineage tracking, and stewardship roles.

Data Integration and Interoperability
  • Build ETL/ELT pipelines and real-time streaming solutions.
  • Ensure consistent identifiers across enterprise applications.
  • Enable internal and external data exchange.
  • Implement orchestration using Informatica, Azure Data Factory, or Snowflake.
  • Design RESTful endpoints for spatial and engineering data.

Technical Leadership and Strategic Support
  • Collaborate with Transportation Technology leaders on enterprise data initiatives.
  • Provide guidance on design choices, standards, tools, and platforms.
  • Act as an internal consultant, mentor, and advocate for change management.

Deliverables
  • Technical Memorandum on current state assessment.
  • Data models, schemas, architecture diagrams, dictionaries, and API specs.
  • Technical Memorandum on data quality measures and ETL enhancements.
  • Governance framework and metadata standards.
  • Implementation Roadmap.
  • Implementation playbook and training materials.
  • Regular stakeholder updates and presentations.

Required Qualifications
  • Minimum of 7 years of experience with large and complex database management systems.
  • Minimum of 7 years of experience as a data architect or senior data analyst.
  • 10+ years of experience with DBMS such as Oracle, SQL Server, or DB2.
  • Strong background in financial data systems.
  • Proven ETL, data warehousing, and automation experience.
  • Experience with data visualization tools.
  • Familiarity with cloud-based data integration solutions.

Preferred Qualifications
  • Experience with DB2 systems.
  • Experience with Informatica or other modern data integration platforms.
  • Strong analytical skills.
  • Excellent communication and collaboration skills.