Key Responsibilities
Requirement Analysis: Collaborate with stakeholders to understand business
requirements and data sources, and define the architecture and design of data
engineering models to meet these requirements.
Architecture Design: Design scalable, reliable, and efficient data engineering models,
including algorithms, data pipelines, and data processing systems, to support business
requirements and quantitative analysis.
Technology Selection: Evaluate (through proofs of concept) and recommend appropriate
technologies, frameworks, and tools for building and managing data engineering models,
weighing factors such as performance, scalability, and cost-effectiveness.
Data Processing: Develop and implement data processing logic, including data cleansing,
transformation, and aggregation, using technologies such as AWS Glue, AWS Batch, and
AWS Lambda.
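As a minimal sketch of the cleansing, transformation, and aggregation work this item describes, the plain-Python functions below trim whitespace, drop invalid rows, coerce types, and sum values per group. The record schema and field names (`id`, `amount`, `category`) are illustrative assumptions, not part of the posting:

```python
def clean_records(records):
    """Cleanse raw records: trim string whitespace, drop rows missing an
    'id', and coerce 'amount' to float (defaulting to 0.0 on bad input)."""
    cleaned = []
    for rec in records:
        # Trim whitespace on every string field.
        rec = {k: v.strip() if isinstance(v, str) else v for k, v in rec.items()}
        if not rec.get("id"):
            continue  # drop records without an identifier
        try:
            rec["amount"] = float(rec.get("amount", 0))
        except (TypeError, ValueError):
            rec["amount"] = 0.0  # unparseable amounts default to zero
        cleaned.append(rec)
    return cleaned


def aggregate_by_key(records, key="category"):
    """Aggregate cleaned records: sum 'amount' per distinct key value."""
    totals = {}
    for rec in records:
        k = rec.get(key, "unknown")
        totals[k] = totals.get(k, 0.0) + rec["amount"]
    return totals
```

In practice this kind of logic would run inside an AWS Glue job or Lambda handler rather than standalone, but the transformation steps are the same.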
Quantitative Analysis: Collaborate with data scientists and analysts to develop algorithms
and models for quantitative analysis, using techniques such as regression analysis,
clustering, and predictive modeling.
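To make the regression-analysis part of this item concrete, here is a sketch of ordinary least-squares simple linear regression in plain Python; real work in this role would more likely use a library such as scikit-learn or statsmodels, so treat this as an illustration of the underlying technique only:

```python
def fit_linear(xs, ys):
    """Ordinary least-squares fit of y = slope * x + intercept.

    slope = cov(x, y) / var(x); intercept = mean(y) - slope * mean(x).
    """
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    slope = cov / var
    intercept = mean_y - slope * mean_x
    return slope, intercept
```

For example, fitting points that lie exactly on y = 2x + 1 recovers a slope of 2 and an intercept of 1.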
Model Evaluation: Evaluate the performance of data engineering models using metrics
and validation techniques, and iterate on models to improve their accuracy and
effectiveness.
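Two common validation metrics for the model-evaluation work described above are root-mean-square error and the coefficient of determination (R²); a minimal sketch, using only the standard library:

```python
import math


def rmse(actual, predicted):
    """Root-mean-square error between actual and predicted values."""
    n = len(actual)
    return math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, predicted)) / n)


def r_squared(actual, predicted):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    mean_a = sum(actual) / len(actual)
    ss_res = sum((a - p) ** 2 for a, p in zip(actual, predicted))
    ss_tot = sum((a - mean_a) ** 2 for a in actual)
    return 1 - ss_res / ss_tot
```

A perfect model gives RMSE 0 and R² 1; a model that always predicts the mean gives R² 0.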
Data Visualization: Create visualizations of data and model outputs to communicate
insights and findings to stakeholders.
Required Skills
Data Engineering: Understanding of data engineering principles and practices, including
data ingestion, processing, transformation, and storage, using tools and technologies
such as AWS Glue, AWS Batch, and AWS Lambda.
Quantitative Analysis: Proficiency in quantitative analysis techniques, including statistical
modeling, machine learning, and data mining, with experience in implementing
algorithms for regression analysis, clustering, classification, and predictive modeling.
Programming Languages: Proficiency in programming languages commonly used in data
engineering and quantitative analysis, such as Python, R, Java, or Scala.