ETL Engineer

Job Location US-MA-Medford
Regular Full-Time


Reporting to the Director of Data Integration and Services, ERT is looking for a Data Engineer who is passionate about data and about unlocking its value for ERT’s innovative data products. This individual is expected to be a key contributor on the ERT Data Platform team, assisting in the design, implementation, and support of data solutions and systems, including integration with Operational Data Stores, Master Data Management, and the Data Lake in the AWS cloud. The candidate must have strong analytical skills and the ability to be creative and think outside the box. This individual must have excellent business communication skills and must be able to work with operations teams as a data expert and advisor to help overcome obstacles and design the right data solutions. This position will play an important role in extending the ERT data platform architecture, producing high-quality deliverables, testing code, providing guidance to test teams, and demonstrating code ownership related to DevOps and monitoring. This is an opportunity to be part of an innovative engineering team with a learning culture, where teamwork and collaboration are encouraged, excellence is rewarded, and diversity is respected and valued.


  1. Implement and support a platform providing secure access to sensitive clinical data within ERT and directly to customers.
  2. Interface with operational delivery teams, gathering information and delivering complete solutions.
  3. Model data and metadata to support ad-hoc and pre-built data analysis.
  4. Provide subject matter expertise and advice to data consumers surrounding ERT’s data pipelines supporting ongoing metrics, reports, analyses, dashboards, etc. to drive key business decisions.
  5. Recognize and adopt best practices in reporting and analysis: data integrity, test design, analysis, validation, and documentation.
  6. Tune application and query performance using profiling tools and SQL.
  7. Analyze and solve problems at their root, stepping back to understand the broader context.
  8. Learn and understand a broad range of AWS data resources, and know when, how, and which to use (and which not to).
  9. Keep up to date with advances in big data technologies and run pilots to evolve the data architecture to scale with increasing data volumes on AWS.
  10. Continually improve ongoing reporting and analysis processes, automating or simplifying self-service support for datasets.
  11. Triage many possible courses of action in a high-ambiguity environment, making use of both quantitative analysis and business judgment.
  12. Work effectively in a fast-paced environment where continuous innovation is occurring and ambiguity is the norm.




  1. Keeps current with applicable Standard Operating Procedures and associated training.
  2. Basic knowledge of consultative/customer focus.
  3. Basic knowledge of Thinking Skills.
  4. Basic knowledge of Organizational Awareness.
  5. Basic knowledge of Interpersonal Relations.
  6. Basic knowledge of Communication.
  7. Basic knowledge of Project Management.


The duties and responsibilities listed in this job description represent the major responsibilities of the position.  Other duties and responsibilities may be assigned, as required.  ERT reserves the right to amend or change this job description to meet the needs of ERT.  This job description and any attachments do not constitute or represent a contract.



  • Bachelor's degree in Computer Science or a related field.
  • Candidate must be located within commuting distance of Boston, MA, or be willing to relocate to the area.




  • At least 3 years of experience building data pipelines.
  • At least 3 years of experience in dimensional modeling, ETL development, and/or data warehousing.
  • At least 4 years of database programming experience.
  • At least 3 years of experience in Python, Go and/or Java.
  • Experience with data ingestion and data cataloging in AWS using Kinesis (or Kafka), Glue, and Athena.
  • At least 2 years of experience with AWS databases, including Aurora, DynamoDB, and/or RDS Oracle.
  • Strong hands-on experience with SQL.
  • Experience working with different data formats, such as JSON, XML, and Parquet.
  • At least 3 years of experience with the software development life cycle.
  • At least 3 years of experience with project life cycle activities on development and maintenance projects.
  • At least 3 years of experience with design and architecture review.
  • Ability to work on a team in a diverse, multi-stakeholder environment.
  • Strong analytical skills.
  • Experience with, and desire to work in, a global environment.

We are an equal opportunity employer and all qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability status, protected veteran status, or any other characteristic protected by law.

