Overview
Job Type: Full-Time
Job Category: All
Industries: IT
Salary: IDR 20,000,000 - 30,000,000 / Month
Who you'll be working for
An Asia-focused digital transformation firm delivering enterprise solutions in Microsoft Dynamics 365, cloud migration, advanced data analytics, and AI. It helps organizations optimize operations, accelerate growth, and innovate through cutting-edge Microsoft technologies.
What requirements you'll need to be eligible
- Hands-on experience with Fabric Lakehouse, Warehouse, Data Pipelines, Dataflow Gen2.
- Knowledge of OneLake file/table structures.
- Experience with Power BI (DAX, modeling, optimization).
- Practical experience operating Databricks in production.
- Strong Spark/PySpark development.
- Experience with Delta Lake, DLT, Unity Catalog.
- Strong SQL and solid data modeling skills.
- Experience with cloud platforms (Azure preferred; AWS/GCP acceptable).
- Understanding of modern data architecture (Lakehouse, ELT, Streaming).
- Microsoft data platform certifications (e.g., DP-203, DP-500) OR Databricks certifications.
- Experience with Git/DevOps workflows.
- Knowledge of Datamarts, KQL DB, Event Streams, Direct Lake.
- Experience with Databricks SQL or Photon optimization.
- Experience with MLflow model versioning and deployment.
- Experience with Feature Store / ML pipelines.
What you'll be doing on the job
- Design and build enterprise data platforms using Microsoft Fabric or Databricks.
- Develop and operate Fabric Lakehouse / Warehouse / OneLake OR Delta Lake tables (batch + streaming).
- Build and maintain data pipelines using Fabric Data Pipelines, Dataflow Gen2, ADF OR Databricks Workflows, Delta Live Tables.
- Implement real-time data processing using Event Streams / Real-Time Hub OR streaming tools such as Spark Structured Streaming, Kafka, Kinesis, etc.
- Develop ETL/ELT pipelines using Python, Spark, or T-SQL.
- Build batch and real-time data pipelines using Apache Spark / PySpark.
- Design dimensional models (Star/Snowflake) for enterprise analytics.
- Build and optimize semantic models for Power BI (Direct Lake, Import, or DAX-optimized models).
- Manage data security, governance, and permissions using Fabric workspaces / OneLake OR Unity Catalog.
- Collaborate with business stakeholders to translate requirements into scalable data models.
- Optimize performance for large-scale data processing workloads.
- Implement MLOps and feature pipelines using MLflow.
- Operate Fabric or Databricks in production environments.
- Support Power BI environment administration and optimization.
Consultant Contact
Sound interesting?
Apply!