
Data Analyst (ETL)

Salary undisclosed



Job Description :

  • Proficiency in at least one of the following programming languages: Python, Scala, or Java.
  • Strong understanding of big data and related technologies such as Flink, Spark, Airflow, and Kafka.
  • Experience with different types of databases: NoSQL, columnar, and relational.
  • You have a hunger for consuming data, exploring new data technologies, and discovering innovative solutions to the company's data needs
  • You are organized and insightful, and can communicate your observations well to stakeholders, both in writing and verbally, to share updates and coordinate the development of data pipelines

Roles and responsibilities :

  • Developing scalable and reliable ETL pipelines and processes to ingest data from a large number and variety of data sources (see the sketch after this list)
  • Developing a deep understanding of real-time data production and availability to inform real-time metric definitions
  • Maintaining and optimizing the performance of our data analytics infrastructure to ensure accurate, reliable and timely delivery of key insights for decision making
  • Design and deliver the next-gen data lifecycle management suite of tools/frameworks, including ingestion and consumption on top of the data lake to support real-time, API-based and
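As a rough illustration of the ETL work described above, the following is a minimal, self-contained Python sketch: it extracts records from a hypothetical JSON source, applies a simple cleaning transformation, and loads the result into a hypothetical data-lake path as CSV. The paths, field names, and transformation logic are illustrative assumptions only, not a description of the role's actual stack.

    # Minimal illustrative ETL sketch (hypothetical paths and fields, for illustration only).
    import csv
    import json
    from pathlib import Path

    SOURCE_PATH = Path("source/events.json")        # hypothetical raw-data source
    LAKE_PATH = Path("datalake/events_clean.csv")   # hypothetical data-lake destination


    def extract(path: Path) -> list[dict]:
        """Read raw JSON records from the source file."""
        with path.open() as f:
            return json.load(f)


    def transform(records: list[dict]) -> list[dict]:
        """Keep only well-formed records and normalise field names (assumed schema)."""
        cleaned = []
        for rec in records:
            if "event_id" in rec and "timestamp" in rec:
                cleaned.append({
                    "event_id": rec["event_id"],
                    "timestamp": rec["timestamp"],
                    "value": rec.get("value", 0),
                })
        return cleaned


    def load(records: list[dict], path: Path) -> None:
        """Write the cleaned records to the destination as CSV."""
        path.parent.mkdir(parents=True, exist_ok=True)
        with path.open("w", newline="") as f:
            writer = csv.DictWriter(f, fieldnames=["event_id", "timestamp", "value"])
            writer.writeheader()
            writer.writerows(records)


    if __name__ == "__main__":
        load(transform(extract(SOURCE_PATH)), LAKE_PATH)

In practice such a job would typically be orchestrated by a scheduler such as Airflow and run against distributed engines such as Spark or Flink, but the extract-transform-load structure remains the same.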
