HyerTek is seeking a Data Engineer to build ETL pipelines, dataflows, and analytics foundations for federal government clients. You'll work with sources such as Oracle, Jira, Confluence, SharePoint, Planner, and legacy databases, migrating data to modern Microsoft platforms and making it usable for reporting and insights.
Responsibilities
- Build and maintain ETL pipelines using Azure Data Factory
- Develop Power Platform Dataflows for data transformation and loading
- Extract data from diverse sources, including Oracle, SQL Server, Jira, Confluence, SharePoint, Planner, REST APIs, and flat files
- Write complex transformations using Data Factory expressions, Power Query M, and SQL
- Implement incremental load and change data capture patterns
- Schedule, monitor, and troubleshoot pipeline runs
- Build semantic models and datasets in Power BI and Microsoft Fabric
- Develop and optimize dataflows that feed Power BI reports
- Support report developers with data preparation and modeling
- Create Fabric lakehouses and data pipelines where applicable
- Write DAX measures and optimize dataset performance
- Ensure data quality and consistency across reporting layers
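For candidates unfamiliar with the incremental-load pattern mentioned above, here is a minimal sketch of the high-water-mark approach in plain Python with SQLite. The table, column names, and timestamps are purely illustrative (not from this posting); in practice the same logic typically lives in an Azure Data Factory pipeline, with the watermark persisted in a control table.

```python
import sqlite3

# Toy in-memory source table standing in for an Oracle or SQL Server source.
# All names and values here are hypothetical, for illustration only.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE orders (id INTEGER, amount REAL, modified_at TEXT)")
src.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [
        (1, 10.0, "2024-01-01T00:00:00"),
        (2, 20.0, "2024-01-02T00:00:00"),
        (3, 30.0, "2024-01-03T00:00:00"),
    ],
)

def incremental_extract(conn, watermark):
    """Pull only rows modified after the stored watermark, then advance
    the watermark to the latest timestamp seen (high-water-mark pattern)."""
    rows = conn.execute(
        "SELECT id, amount, modified_at FROM orders "
        "WHERE modified_at > ? ORDER BY modified_at",
        (watermark,),
    ).fetchall()
    new_watermark = rows[-1][2] if rows else watermark
    return rows, new_watermark

# First run: only rows newer than the initial watermark come across.
rows, wm = incremental_extract(src, "2024-01-01T00:00:00")

# Second run with no new source changes: nothing is re-extracted.
rows2, wm2 = incremental_extract(src, wm)
```

The first run extracts two rows (ids 2 and 3) and advances the watermark; the second run extracts nothing, which is the point of the pattern: each scheduled run moves only the delta rather than reloading the full table.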