BURN is working to create a world where cooking positively impacts all life on earth, by producing the world’s most efficient cooking appliances.
About the Role:
BURN is seeking a skilled and experienced Data Engineer (or Data Pipeline System Designer) to design and deploy an end-to-end data pipeline system that centralizes data from various sources and enables data professionals to query it easily. The system should also allow users to easily pull up all relevant information for a product and customer through a consumer-friendly user interface.
Duties and Responsibilities:
- Design and deploy an end-to-end data pipeline system that centralizes and processes large volumes of structured and unstructured data from various sources.
- Develop user-friendly interfaces that enable users to easily pull up relevant information for a product and customer.
- Collaborate with data scientists, data analysts, and other stakeholders to understand data requirements and design data solutions that meet their needs.
- Design and implement efficient data extraction, transformation, and loading (ETL) processes to populate the pipeline with data from various sources.
- Build and maintain robust data pipelines that ensure data is accurate, up-to-date, and easily accessible.
- Develop and maintain data models, data schemas, and data dictionaries.
- Use APIs, batch exports, and SQL queries to extract data from various sources and integrate it into a SQL database.
- Perform data cleaning, data transformation, and data integration tasks to ensure data quality and consistency.
- Collaborate with data analysts, data scientists, and other stakeholders to ensure data is processed and analyzed effectively.
- Monitor and optimize data pipelines to ensure they are performing efficiently.
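The extract-transform-load duties above can be sketched as a minimal example. The table schema, field names, and sample records below are illustrative assumptions, not BURN's actual data model, and the extraction step stands in for a real API call or batch export:

```python
import sqlite3

def extract():
    # Placeholder for an API call or batch export; returns raw records.
    return [
        {"product_id": "ST-100", "customer": " Jane Doe ", "units": "3"},
        {"product_id": "st-100", "customer": "John Roe", "units": "2"},
    ]

def transform(records):
    # Clean and normalize: standardize IDs, trim whitespace, cast types.
    return [
        (r["product_id"].upper(), r["customer"].strip(), int(r["units"]))
        for r in records
    ]

def load(rows, conn):
    # Load the cleaned rows into a SQL database (SQLite here for brevity).
    conn.execute(
        "CREATE TABLE IF NOT EXISTS sales "
        "(product_id TEXT, customer TEXT, units INTEGER)"
    )
    conn.executemany("INSERT INTO sales VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract()), conn)
total = conn.execute("SELECT SUM(units) FROM sales").fetchone()[0]
print(total)  # 5
```

In practice each stage would be scheduled and monitored (the final duty above), with the in-memory SQLite swapped for a production warehouse.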
Success in the role will be measured within the first few months by the following:
- Successful deployment of the end-to-end data pipeline system, including system implementation, ETL processes, and data handling capabilities.
- Data accessibility and usability, measured by the ease of use of the interfaces and the speed of accessing relevant information.
- Data quality and consistency, monitored through data accuracy, completeness, consistency, and integrity.
- Collaboration and stakeholder satisfaction, measured by stakeholder feedback on the effectiveness of data solutions and by maintaining positive working relationships.
Skills and Experience:
- Bachelor’s degree in Computer Science, Information Systems, or a related field.
- At least 5 years of experience in designing and deploying end-to-end data pipelines.
- Strong knowledge of SQL, ETL tools, and data warehousing concepts.
- Experience working with APIs, batch exports, and SQL queries to extract data from various sources.
- Experience with cloud computing platforms such as AWS, Azure, or Google Cloud Platform.
- Strong data analysis and problem-solving skills.
- Experience working with Microsoft Dynamics, open-source data systems like KOBO, and Call Center platforms would be an added advantage.
- Excellent communication skills and ability to work in a team environment.