Amazon EU Sàrl.
Posted 2 days ago

  • Design and lead reviews for efficient data models using industry best practices and metadata for ad hoc and pre-built reporting
  • Design, build, and own all the components of a high-volume data warehouse end to end
  • Interface with business customers to gather requirements and deliver complete data and reporting solutions, owning the design, development, and maintenance of ongoing metrics, reports, analyses, dashboards, etc. to drive key business decisions

  • Continually improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers
  • Interface with other technology teams to extract, transform, and load (ETL) data from a wide variety of data sources
  • Own the functional and nonfunctional scaling of software systems in your ownership area.
  • Manage stakeholders and provide status reporting
  • Identify new opportunities in partnership with business partners

    Bachelor's degree in Computer Science, Engineering, Statistics, Mathematics, or a related field

  • 10-12 years of experience in the data engineering / business intelligence space
  • Curious, self-motivated self-starter with a can-do attitude; comfortable working in a fast-paced, dynamic environment
  • Strong understanding of ETL concepts and experience building ETL pipelines with large-scale, complex datasets using traditional or MapReduce-based batch mechanisms
  • Strong data modelling skills with solid knowledge of industry standards such as dimensional modelling and star schemas
  • Extremely proficient in writing performant SQL against large data volumes
  • Experience designing and operating very large data warehouses
  • Experience with at least one scripting language (e.g., UNIX shell scripting, Python, Perl, Ruby)
  • Experience working on the AWS stack is a plus
  • Clear thinker with superb problem-solving skills to prioritize and stay focused on big needle movers

    Master’s degree in Computer Science, Information Systems, Mathematics or related discipline

  • Strong knowledge of one or more scripting languages (Python, Perl)
  • Strong analytical skills with excellent knowledge of Oracle, SQL, and PL/SQL
  • Expert understanding of ETL techniques and best practices for handling extremely large volumes of data
  • Experience with AWS using S3, EC2, Redshift, Aurora, Lambda, QuickSight, etc.
  • Experience with data ingestion techniques for batch and stream processing using AWS Batch, AWS Kinesis, and AWS Data Pipeline
  • Experience with AWS Big Data technologies such as EMR, Glue, Athena, and Redshift Spectrum
  • Strong knowledge of machine learning, data mining, and predictive modeling
  • Ability to handle multiple, competing priorities in a fast-paced environment
  • Work well in teams, respecting ideas from teammates, business partners, and technical experts
  • Experience in maintaining data warehouse systems and working with large scale data transformations using Hadoop, Hive, Spark, or other Big Data technologies
  • Strong customer focus, ownership, and drive to get things done
  • AWS certifications or other related professional technical certifications