Are you ready to take your data engineering skills to the next level?
Amazon is seeking a data engineer with a deep love for data who will play a pivotal role in re-architecting our data universe, crafting dynamic pipelines that empower our business customers and propel us into the future.
About the team
S&A (Standardization & Automation) views technology as a true enabler for Finance, making data available at various grains for analysis and planning. S&A believes that all planning should be automated, driven by business-led inputs and automatic calculations that enable comparison of multiple scenarios. Finance should be able to serve the ad-hoc needs of the business (with the speed necessary for business execution) while also driving step-function changes by incorporating technology into financial processes. In addition, S&A believes that new Finance team members should be onboarded through a comprehensive program that imparts the knowledge and skills needed to succeed in their role. S&A improves efficiency for its users by delivering time-saving products and standardization frameworks that eliminate variance and subjectivity; these efficiencies are measured in hours saved for customers when comparing the old process against S&A’s improvements.
As a Data Engineer in WWSNA, you will work in one of the world's largest and most complex data warehouse environments. You will design, implement, and support scalable data infrastructure and complex data models for Amazon's Inbound & Outbound Transportation business. As a DE you will create solutions that integrate multiple heterogeneous data sources, aggregate and retrieve data quickly and safely, and curate data for use in reporting, analysis, machine learning models, and ad-hoc data requests. You should have excellent business and communication skills so you can work with business owners, product teams, and tech leaders to gather infrastructure requirements, design data infrastructure, and build the data pipelines and datasets that meet business needs. You will be responsible for designing, developing, and operating a data service platform that uses Python, Airflow, and SQL to build ETL, analytics, and data quality components. You’ll automate deployments using AWS CodeDeploy, AWS CodePipeline, AWS Cloud Development Kit (CDK), and AWS CloudFormation, and you will work with AWS services such as Redshift, Glue, S3, IAM, CloudWatch, and more.
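For a concrete, purely illustrative flavor of this work, the sketch below shows the kind of Python/Airflow/SQL pipeline described above: stage data from S3 into Redshift, then curate a SQL aggregate for reporting and data-quality checks. Every identifier in it (DAG id, bucket, schemas, tables, connection ids) is a hypothetical placeholder, not a reference to an actual Amazon system.

```python
# Illustrative sketch only -- not Amazon's actual code.
# Stages raw events from S3 into Redshift, then builds a curated daily aggregate.
from datetime import datetime

from airflow import DAG
from airflow.providers.amazon.aws.transfers.s3_to_redshift import S3ToRedshiftOperator
from airflow.providers.common.sql.operators.sql import SQLExecuteQueryOperator

with DAG(
    dag_id="transportation_daily_load",       # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Stage raw shipment events from S3 into a Redshift staging table.
    load_raw = S3ToRedshiftOperator(
        task_id="load_raw_events",
        redshift_conn_id="redshift_default",  # assumed Airflow connection id
        s3_bucket="example-inbound-bucket",   # hypothetical bucket
        s3_key="shipments/{{ ds }}/",
        schema="staging",
        table="shipment_events",
        copy_options=["FORMAT AS PARQUET"],
    )

    # Curate a daily aggregate used by reporting and data-quality checks.
    build_curated = SQLExecuteQueryOperator(
        task_id="build_daily_aggregate",
        conn_id="redshift_default",
        sql="""
            INSERT INTO curated.daily_shipment_summary
            SELECT event_date, carrier, COUNT(*) AS shipment_count
            FROM staging.shipment_events
            WHERE event_date = '{{ ds }}'
            GROUP BY event_date, carrier;
        """,
    )

    load_raw >> build_curated
```

A pipeline like this would then be deployed and its infrastructure provisioned through the CodePipeline, CDK, and CloudFormation tooling mentioned above.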
Basic Qualifications
Preferred Qualifications
What unites Amazon employees across teams and geographies is that we are all striving to delight our customers and make their lives easier.
The scope and scale of our mission drives us to seek diverse perspectives, be resourceful, and navigate through ambiguity. Inventing and delivering things that were never thought possible isn't easy, but we embrace this challenge every day.
By working together on behalf of our customers, we are building the future one innovative product, service, and idea at a time. Are you ready to embrace the challenge? Come build the future with us.