Senior Data Engineer

Job Title: Senior Data Engineer

Location: Infopark Phase 2, Cochin

Job Type: Full Time

Experience Required: 3+ Years

Qualification: BCA / BTech / MTech or similar

Immediate Joiner Preferred

Who are we?

We are a fast-growing organization focused on AI, Data Science, Data Engineering, Clinical Decision Support, and Information & Cyber Security compliance, with a presence in Kochi, the UK, and the US. We aspire to be a prominent player in decision-making solutions in the coming years. Our strength is a team of 150 highly motivated employees.

What do we offer?

A high-growth environment where you can take on new challenges, build a career, and make an impact. Cross-cultural, multinational exposure, with access to experienced leaders and mentors. Competitive compensation and benefits.

Roles & Responsibilities:

  • Design, develop, and maintain scalable ETL/ELT data pipelines.
  • Build and manage data lake and data warehouse solutions.
  • Optimize and automate ingestion, transformation, and delivery of structured and unstructured data.
  • Implement data quality, lineage, and governance best practices.
  • Collaborate with Data Science, Visualization, and Engineering teams to operationalize data solutions.
  • Monitor, debug, and improve data platform performance.

Primary Skills:

  • Advanced Python and SQL; relational and document databases; batch and stream processing; data pipeline monitoring and management; handling and transforming structured and unstructured data.
  • Basic ML skills are a plus.
  • Understanding of data modeling, data governance, and CI/CD practices.
  • Solid experience in debugging and optimizing high-volume data processing.
  • Hands-on experience with the modern data stack: Apache Spark, Airflow, dbt, Kafka, or equivalent technologies.

Must have: 

AWS (or any other cloud provider), hands-on Ubuntu Linux, Bash shell scripting, and development with Docker.

Secondary Skills:

  • Strong communication skills and a good understanding of project management and the software development workflow.
  • Experience with containerization (e.g., Docker, Kubernetes) and orchestration (Airflow, Dagster, Cron, etc.). 
  • Familiarity with ML Ops or Data Science workflows. 
  • Knowledge of streaming/batch/on-demand processing data architecture. 
  • Exposure to domain-specific analytics (e.g., retail, healthcare, finance). 
  • Data lake and warehouse technologies (e.g., Databricks, Snowflake, Redshift, BigQuery, Delta Lake). 

Why Join Us?

  • Opportunity to work with cutting-edge technologies and innovative projects.
  • Collaborative and dynamic work environment.
  • Career growth and professional development opportunities.
  • Competitive salary and benefits package.

Apply Here