Databricks Architect

Tredence Inc. | Bangalore, KA, IN

Posted 11 days ago

Tredence is a global analytics services and solutions company. According to the Inc. 5000, we have been one of the fastest-growing private companies in the country for three straight years, and we continue to set ourselves apart from our competitors by attracting the best talent in the data analytics and data science space. Our capabilities range from data visualization and data management to advanced analytics, big data, and machine learning. Our uniqueness lies in building scalable big data solutions on on-prem/GCP/Azure clouds in a cost-effective and easily scalable manner for our clients. We also bring strong IP and pre-built analytics solutions in data mining, BI, and big data.

Job Description

In this rapidly evolving world, Tredence is always looking for new ways to disrupt the status quo, go to market faster, and optimize customer experiences. We are seeking an experienced product engineer to lead the charge on this mission. The ideal candidate will have extensive experience executing ML and data science solutions and managing and collaborating with a team of data engineers, data scientists, and software engineers. They will foster an environment of collaboration and creativity. Most importantly, they will be integral to helping us build better products that people and businesses love to use.

We are looking for candidates to join across our office locations: Bangalore, Chennai, Pune, Gurgaon, and Kolkata.

Primary Roles and Responsibilities:

● Design and develop modern data warehouse solutions using Databricks and the AWS/Azure stack

● Provide forward-thinking solutions in the data engineering and analytics space

● Collaborate with DW/BI leads to understand new ETL pipeline development requirements.

● Triage issues to find gaps in existing pipelines and fix the issues

● Work with the business to understand reporting-layer needs and develop data models to fulfill them

● Help junior team members resolve issues and technical challenges

● Drive technical discussions with client architects and team members

● Orchestrate data pipelines in the scheduler via Airflow

Skills and Qualifications:

● Bachelor's and/or master's degree in computer science, or equivalent experience.

● 8+ years of total IT experience, including 2+ years in data warehouse/ETL projects.

● Deep understanding of Star and Snowflake dimensional modelling.

● Strong knowledge of Data Management principles

● Good understanding of the Databricks Data & AI platform and the Databricks Delta Lake architecture

● Hands-on experience with SQL, Python, and Spark (PySpark)

● Experience with the AWS/Azure stack

● Desirable: ETL with both batch and streaming (e.g., Kinesis).

● Experience in building ETL / data warehouse transformation processes

● Experience with streaming data / event-based data

● Experience with other open-source big data products such as Hadoop (incl. Hive, Pig, Impala)

● Experience with open-source non-relational/NoSQL data repositories (incl. MongoDB, Cassandra, Neo4j)

● Experience working with structured and unstructured data including imaging & geospatial data.

● Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning, and troubleshooting

● Databricks Certified Data Engineer Associate/Professional Certification (Desirable).

● Comfortable working in a dynamic, fast-paced, innovative environment with several ongoing concurrent projects

● Experience working in an Agile methodology

● Strong verbal and written communication skills.

● Strong analytical and problem-solving skills with a high attention to detail.

Tredence is an equal opportunity employer. We celebrate and support diversity and are committed to creating an inclusive environment for all employees.

  • Visit our website for more details.