Amazon

Data Engineer

  • Bangalore (Bangalore Urban)
  • IT development

Job description

Amazon.com operates in a virtual, global e-commerce environment without boundaries and runs a diverse set of businesses, including retail, third-party marketplaces, e-commerce platforms, and web services for developers. Amazon's mission is to be earth's most customer-centric company.

Compliance Operations (C-Ops) is part of the Health Safety Sustainability Security and Compliance (HSSSC) organization within Amazon. C-Ops ensures that Amazon transactions satisfy legal and safety requirements in compliance with guidelines set by regulatory bodies. We identify the risks involved in handling hazardous products during storage and transport and classify products with the appropriate hazmat attributes. The team also reviews aspects of product transactions that are regulated (distribution, shipping, sale, and import/export), which involves analyzing product import documentation. We focus on product testing, certification, and regulatory permitting to ensure customer safety and protect Amazon in a constantly changing global environment.

As a Data Engineer, you should be an expert in architecting DW solutions for the enterprise across multiple platforms, and you should excel in the design, creation, management, and business use of extremely large datasets. You should have excellent business and communication skills, working with business analysts and engineers to determine how best to design the data warehouse for reporting and analytics. You will be responsible for designing and implementing scalable ETL processes in the data warehouse platform to support rapidly growing and dynamic business demand for data, and for delivering data as a service that has an immediate influence on day-to-day decision making. You should also be able to develop and tune SQL to provide optimized solutions to the business.

Desired profile

BASIC QUALIFICATIONS

· Experience writing high-quality, maintainable SQL on large datasets
· Ability to write code in Python, Ruby, Scala, or another big data platform language
· Expertise in star schema data modelling
· Exposure to or experience with big data technologies (Hadoop, Spark, etc.)
· Strong analytical and problem-solving skills
· Expertise in the design, creation, and management of large datasets/data models
· Experience building and optimizing logical data models and data pipelines while delivering high-data-quality solutions that are testable and adhere to SLAs
· Experience with AWS services including S3, Redshift, EMR, and RDS
· Excellent verbal and written communication skills
· Ability to work with business owners to define key business requirements and convert them into technical specifications
