Data Aces is looking for a Data Warehouse Engineer to work on projects building and migrating legacy data warehouses and data marts to state-of-the-art, cloud-based data lake and data warehouse systems.
The candidate should be able to code in Scala, Python, and Spark, and possess a good understanding of the cloud ecosystem, particularly Amazon Web Services.
• Experience with data lake implementation in the cloud for large data warehouses.
• Experience coding in Scala and Spark; NoSQL database experience desirable.
• Design and build ETL processes and rules to extract, transform, and load data into the data warehouse; take ownership of issues and deliver working code to tight timelines.
• Design and update data models, schemas, and patterns. Work collaboratively with the rest of the team on designs and changes to other parts of the application.
• Design a QC framework to automate ETL validations.
• Lead specific customization projects, including code reviews and ensuring the quality of deliverables.
• Identify and recommend improvements to application solutions in support of data warehouse applications and business intelligence reporting (e.g., data quality, performance, static and dynamic reports).
• Research, manage, and coordinate resolution of complex issues through root-cause analysis as appropriate.
• Should be able to benchmark systems, analyze system bottlenecks, and propose solutions to eliminate them, especially in data lake implementations.
• Should be able to clearly articulate the pros and cons of various technologies and platforms.
• Should be able to document use cases, solutions, and recommendations.
• Should have excellent written and verbal communication skills.
• Should be able to perform detailed analysis of business problems and technical environments and apply it when designing the solution.
• Should be able to build and mentor the team around new technologies.
• Should be able to work in teams, as a big data environment is developed by a team of employees with different disciplines.
• Should be able to work in a fast-paced agile development environment.
• Should be able to drive big data consulting for different clients and needs.
Master’s degree in Information Technology; a Bachelor’s degree may be considered with relevant experience.
Proven experience working with an offshore support team
Previous experience working in a fast-paced, high-pressure agile development environment
Proven analytical and problem-solving skills for assessing complex requirements; strong coding skills
Natural technical and functional curiosity
Excellent communication skills
Knowledge of data science (Python, R, scikit-learn, TensorFlow, neural networks) is a bonus
SQL, Spark, Scala, HDFS, Python, and other big data technologies
Good knowledge of and implementation experience with data modeling in OLAP and data warehouse environments
Experience with various databases (SQL, NoSQL)