AWS Data Engineer

Full Time
  • India

Website: CosMicIT

Find Your Dream Job Here

Hello Everyone,

We at #CosMicIT are looking for an #AWS Data Engineer.

Location: India

Job Description:

  • Roles and responsibilities (total experience: more than 4 years)

    1. Building and maintaining data pipelines: You will be responsible for designing, building, and maintaining data pipelines that move data from various sources into a data warehouse or data lake.

    2. Data modeling and database design: You will design and implement data models that support the needs of data scientists and other stakeholders.

    3. Data integration: You will integrate data from various sources, including structured and unstructured data, and ensure that the data is clean, accurate, and consistent.

    4. Data quality and governance: You will be responsible for ensuring that data is of high quality and meets the needs of stakeholders. You will also ensure that data is governed appropriately and complies with relevant regulations.

    5. Performance tuning and optimization: You will optimize data pipelines and databases to ensure that they perform efficiently and meet the needs of data scientists and other stakeholders.

    6. Collaboration with data scientists: You will work closely with data scientists to understand their needs and provide them with the data they need to perform their analyses.

    7. Continuous learning and improvement: You will stay up-to-date with the latest technologies and best practices in data engineering and continuously improve your skills and knowledge.

    Technical Skills

    • Mandatory:

    Apache Spark / AWS Glue: Extensive experience working with Spark and AWS Glue for data processing and ETL workflows.

    AWS: In-depth knowledge of AWS services, particularly S3, Glue, and Redshift / Redshift Spectrum, to build scalable and efficient data solutions.

    Data lakes: Proficiency in working with data lakes, utilizing Parquet or Iceberg tables for efficient storage and retrieval.

    Python: Strong scripting skills in Python to create custom data processing scripts.

    Terraform: Familiarity with Terraform for infrastructure provisioning and management.

    CI/CD: Experience developing CI/CD workflows with CircleCI or GitHub Actions.

    • Good to have:

    Datadog: Experience with Datadog for monitoring and observability.

    Liquibase: Knowledge of Liquibase for database schema versioning and management.

    Kubernetes / Argo Workflow: Familiarity with Kubernetes and Argo Workflow for container orchestration and workflow management.

    Data Quality with Deequ: Proficiency in ensuring data quality using Deequ for data validation and anomaly detection.

    Alation: Understanding of Alation for data cataloging and collaboration.

If this opening sounds like a fit for you or anyone in your network, please share your resume/CV to

Referrals are also welcome.


CosMicIT GmbH, Germany 🇩🇪

CosMicIT Informatics India Pvt Ltd. 🇮🇳

CosMicIT Spolka Z Ograniczona Odpowiedzialnoscia, Poland

#resume #connections #jobopening #hiring #jobseekers #jobs #recruitment #jobsearch #job #hr #recruiting #references #recruiters #opentonetwork #experience #CosMicIT #indiajobs #india

To apply for this job, email your details to