The Cloud Data Engineer is part of a Data Management team responsible for modernizing and transforming data and reporting capabilities across products by implementing a modernized data architecture. The position is responsible for the day-to-day collection, transportation, maintenance/curation of, and access to the corporate data asset. The Cloud Data Engineer will work cross-functionally across the enterprise to centralize data and standardize it for use by business reporting, machine learning, data science, and other stakeholders. This position plays a critical role in increasing awareness of available data and democratizing access to it across our data partners.
Responsibilities
- Crafting and maintaining efficient data pipeline architecture.
- Assembling large, complex data sets that meet functional / non-functional business requirements.
- Identifying, crafting, and implementing internal process improvements: automating manual processes, optimizing data delivery, redesigning infrastructure for greater scalability, etc.
- Working with technical and non-technical stakeholders to assist with data-related technical issues and support their data infrastructure needs.
- Working with the team to ensure clean, meaningful data and to improve the functionality and flexibility of the team’s data systems.
- Designing processes supporting data transformation, data structures, metadata, dependency, and workload management.
Qualifications
- Cloud data engineering experience in Azure.
- Advanced working knowledge of SQL and NoSQL query authoring.
- Experience with streaming platforms such as Azure Event Hubs and with event-driven architectures.
- Experience building data engineering pipelines in Azure (Azure Data Factory, Databricks, or Synapse).
- Experience with Microsoft Power Platform including Power BI and Power Apps.
- Proficiency with object-oriented/functional scripting languages: Python, C#, etc.
- Experience with big data tools: Databricks, Spark, etc.
- Experience building, maintaining, and optimizing big data pipelines, architectures, and data sets.
- Experience cleaning, testing, and evaluating data quality across a wide variety of data sources.
- Strong project management, communication, and organizational skills.
- Bachelor’s degree or equivalent experience.