The CoinDCX Journey: Building the Future of Finance
At CoinDCX, our mission is clear: to make crypto and blockchain accessible to every Indian and enable them to participate in the future of finance.
As India’s first crypto unicorn valued at $2.45B, we are reshaping the financial ecosystem by building safe, transparent, and scalable products that power adoption at scale.
We believe that change starts together. It begins with bold ideas, relentless execution and people who want to build what’s next.
If you’re driven by purpose and thrive in environments where your work defines the next chapter of an industry, you’ll feel right at home here.
About The Role
We are hiring an SDE-1 Data Engineer (Individual Contributor) to execute high-quality data engineering work across ingestion pipelines, data quality, monitoring, curated datasets, and extensive third-party/vendor integrations.
This role is pure execution: writing code, fixing issues, adding validations, and ensuring reliable, timely data delivery.
You will work hands-on with Spark, Databricks, Python, Kafka, AWS (S3/EC2/Lambda) and internal CDC + ingestion frameworks.
What You’ll Do
Build & Enhance Data Pipelines (Internal + External Ingestion)
Develop ingestion pipelines for internal data (CDC, service DBs).
Build and maintain ingestion from external vendors and third parties, including custody providers, trading partners (TPE), banking partners, and external REST-based API integrations.
Handle pagination, rate limits, incremental loads, retries, and backoffs.
Implement Spark-based transformations on Databricks.
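The pagination, retry, and backoff responsibilities above can be sketched as a small Python helper. This is a minimal illustration, not CoinDCX's actual framework; `fetch_page` is a hypothetical stand-in for a real vendor API client.

```python
import time

def ingest_paginated(fetch_page, max_retries=3, base_delay=0.01):
    """Pull all pages from a paginated vendor API with retries and
    exponential backoff.

    `fetch_page(cursor)` is a hypothetical client: it returns
    (records, next_cursor) and raises on transient failure.
    """
    records, cursor = [], None
    while True:
        for attempt in range(max_retries):
            try:
                page, cursor = fetch_page(cursor)
                break
            except RuntimeError:
                if attempt == max_retries - 1:
                    raise  # exhausted retries; surface the failure
                time.sleep(base_delay * (2 ** attempt))  # exponential backoff
        records.extend(page)
        if cursor is None:  # no next cursor: last page reached
            return records
```

In a real connector the same loop would also honour vendor rate-limit headers and checkpoint the cursor so loads stay incremental across runs.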
Implement Data Quality Checks
Add schema validations, field-level checks, null/boundary checks.
Maintain ≥99% data quality for assigned datasets.
Quickly identify & fix data mismatches caused by source/vendor changes.
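A field-level data-quality pass of the kind described above might look like the following sketch. The `schema` and `bounds` structures are illustrative assumptions, not a real internal API.

```python
def run_dq_checks(rows, schema, bounds):
    """Return a list of violation messages for the given rows.

    schema: {field: expected_type} for type/null checks (illustrative).
    bounds: {field: (min, max)} for boundary checks (illustrative).
    """
    violations = []
    for i, row in enumerate(rows):
        for field, ftype in schema.items():
            value = row.get(field)
            if value is None:
                violations.append(f"row {i}: {field} is null")
            elif not isinstance(value, ftype):
                violations.append(f"row {i}: {field} expected {ftype.__name__}")
            elif field in bounds:
                lo, hi = bounds[field]
                if not (lo <= value <= hi):
                    violations.append(f"row {i}: {field}={value} out of bounds")
    return violations
```

The same checks translate directly into Spark column expressions when the datasets are too large for row-at-a-time validation.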
Monitoring, Alerts & Observability
Configure alerts for freshness, latency, data quality, and pipeline failures.
Add logs and metrics to improve troubleshooting.
Ensure MTTR < 4 hours for failures.
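A freshness alert of the kind listed above could be as simple as the sketch below; the function name and 15-minute default mirror the vendor-lag target later in this posting and are assumptions, not a real internal tool.

```python
from datetime import datetime, timedelta, timezone

def freshness_alert(last_update, now=None, threshold=timedelta(minutes=15)):
    """Return an alert message if the dataset is staler than `threshold`,
    else None. The 15-minute default mirrors the vendor-lag target."""
    now = now or datetime.now(timezone.utc)
    lag = now - last_update
    if lag > threshold:
        return f"FRESHNESS ALERT: data is {int(lag.total_seconds() // 60)} min stale"
    return None  # within threshold, no alert
```

In practice the message would be routed to a pager or chat channel, and the same pattern extends to latency and pipeline-failure alerts.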
Vendor & Third-Party Data Reliability
Monitor vendor API health, schema changes, and data drift.
Add defensive coding, retries, and fallback logic for unstable third-party feeds.
Ensure <15 min data lag for assigned vendor connectors (where applicable).
Keep documentation updated for all vendor integrations.
Curated Datasets Execution
Build/modify curated datasets under guidance.
Ensure 0 metric mismatches and correct business logic.
Maintain documentation for dataset transformations.
Engineering Discipline & On-Time Delivery
Maintain >80% test coverage and 0 PR hygiene rejections.
Execute tasks with 95%+ on-time delivery.
Provide crisp updates with minimal follow-ups.
You’ll Excel in This Role If
Must-Have
Good-to-Have
You’ll Know You’re Winning When
External vendor integrations functioning reliably (<15 min data lag where applicable)
≥99% DQ on all assigned datasets
Full alerting coverage on all owned pipelines
MTTR consistently < 4 hours
95%+ tasks delivered on time
Why This Role Matters & What’s In It For You
Hiring Process
Here’s what your journey with us looks like:
Where We Work
We believe the best ideas emerge when people build together. Collaboration, speed and trust come alive when teams share the same space.
With this belief, we operate as a work-from-office organisation. This role is based out of our Bengaluru office, where energy, alignment and innovation move in real time.
Perks That Empower You
We believe great people deserve great experiences.
Ready to Build What’s Next?
If you’re looking for a role that gives you direct access to high-stakes decisions, deep impact and a chance to build the future of finance, this is it.
Join CoinDCX and help us make crypto accessible to every Indian, together.